A GUIDE FOR ACQUIRING SOFTWARE DEVELOPMENT SERVICES


EXECUTIVE SUMMARY

The General Services Administration (GSA), Information Resources Management Service (IRMS), develops Governmentwide policies and guidance for automated data processing, records, and telecommunications. IRMS' objective is to ensure that Federal agencies acquire, manage, and use Federal information processing (FIP) resources that meet their requirements in an economical and efficient manner. This Guide provides Government program, information resources management, and contracting officials with an introduction to the acquisition of software development services. The Guide, written for readers unfamiliar with the Federal acquisition process for software development services, is not a complete reference work or a regulation and, therefore, directs the reader to other sources for detailed information and guidance.

FOREWORD

The General Services Administration (GSA) is issuing this Guide to assist Federal agencies in acquiring software development services. It should be used in conjunction with existing regulations. This is one of a series of acquisition guides under development by the Policy Analysis Division of the Information Resources Management Service (IRMS). The Federal Systems Integration and Management Center (FEDSIM) is providing technical assistance in developing the series. The GSA Project Manager was Gloria Sochon, with assistance from Robert Karr. GSA gratefully acknowledges Rex Bolton, U.S. Army, who assisted in the review of the Guide as a member of the Interagency Acquisition Guide Advisory Group, and the following GSA staff: Doug Arai, Diane Herdt, Judy Steele, and Jeff Tucker. To obtain copies of this publication or to be added to the GSA IRM Reference Center mailing list, please use the order request forms provided as the last pages of this document. We welcome your comments regarding this Guide and the Acquisition Guide Series. Please use the "Reader's Comments" sheet at the back of the Guide or contact the Policy Analysis Division at (202) 501-2462 with your suggestions or questions.

G. Martin Wagner
Acting Commissioner
Information Resources Management Service
U.S. General Services Administration

This Guide was written by American Management Systems, Inc., Arlington, Virginia.

TABLE OF CONTENTS

1. INTRODUCTION  1-1
  1.1 THE ACQUISITION GUIDE SERIES  1-1
    a. Background  1-1
    b. Audiences  1-1
    c. Objectives  1-2
  1.2 SCOPE OF THE SOFTWARE DEVELOPMENT SERVICES GUIDE  1-3
    a. The Complete Acquisition Life-Cycle  1-3
    b. Limitations  1-5
  1.3 ORGANIZATION OF THE SOFTWARE DEVELOPMENT SERVICES GUIDE  1-5
    a. General Information  1-5
    b. Life-Cycle Focus  1-6

2. GENERAL INFORMATION  2-1
  2.1 DEFINITION OF KEY ACQUISITION TERMS  2-1
    a. Federal Information Processing Resources  2-2
    b. Acquisition  2-2
    c. Requirements Analysis  2-3
    d. Analysis of Alternatives  2-3
    e. Acquisition Strategy  2-4
  2.2 WHAT ARE SOFTWARE DEVELOPMENT SERVICES?  2-4
    a. Software  2-4
      (1) Already-existing Software  2-7
      (2) Custom-developed Software  2-8
    b. Scope of Software Development Services Acquisitions  2-11
      (1) Specific Services That Could be Included in a Software Development Services Acquisition  2-11
      (2) Software Development Services Acquisitions vs. Systems Integration Services Acquisitions  2-11
    c. Relation of Software Development Services to Other FIP Resources  2-12
    d. Relation of This Guide to Others in the Acquisition Guide Series  2-12
  2.3 ROLES AND RESPONSIBILITIES  2-13
    a. End User  2-15
    b. Program Manager  2-15
    c. Information Resources Management/Technical Representative  2-16
    d. Contracting Officer  2-17
    e. Contracting Officer's Technical Representative  2-18
  2.4 KEYS TO SUCCESSFUL ACQUISITIONS  2-19
    a. Define Requirements Functionally  2-19
    b. Obtain Full and Open Competition  2-21
    c. Guard Procurement-Sensitive Information  2-22
    d. Identify, Agree on, and Document Assumptions and Constraints  2-23
    e. Involve All Three Staff Disciplines  2-24
    f. Consider Agency-Specific Requirements or Restrictions  2-25
    g. Begin the Acquisition Process Early  2-25
  2.5 PROTESTS  2-27

3. SPECIAL CHARACTERISTICS OF SOFTWARE DEVELOPMENT SERVICES  3-1
  3.1 HOW SOFTWARE IS DEVELOPED  3-2
    a. Typical Life-Cycle Stages  3-2
      (1) Functional and Data Requirements  3-3
      (2) Design  3-6
      (3) Build  3-7
      (4) Test  3-8
      (5) Implement  3-9
      (6) Operate and Maintain  3-10
    b. Tools Used in the Software Development Process  3-11
      (1) Programming Languages  3-11
      (2) Automated Software Development  3-14
      (3) Prototyping  3-22
  3.2 SOFTWARE LIFE-CYCLE MANAGEMENT METHODOLOGIES  3-25
    a. Methodology Defined  3-25
    b. What a Methodology Should Include  3-26
    c. Benefits and Issues of Following a Life-Cycle Methodology  3-28
    d. Should the Agency Specify a Methodology?  3-29
  3.3 QUALITY  3-30
    a. Two Measures of Quality  3-30
    b. The Importance of Quality in Software Development Services Acquisitions  3-32
    c. Predicting Quality Before Award  3-36
    d. Ensuring Quality After Award  3-37
  3.4 MEASURING PROGRESS  3-39
  3.5 INVOLVEMENT OF MULTIPLE CONTRACTORS DURING SOFTWARE DEVELOPMENT  3-41
  3.6 SIGNIFICANT CONCERNS IN DEVELOPING SOFTWARE  3-44
    a. Development of Open Systems  3-44
    b. Designing-in Computer Security Throughout the Life-Cycle  3-47
    c. Software License Rights  3-48
    d. Data Administration Considerations  3-50
    e. Emerging Trends in Software Development  3-52
      (1) Changes in the Economics of Software Development  3-52
      (2) Rightsizing  3-53
      (3) Incremental Software Development  3-54
      (4) Interactive Design Techniques  3-56
      (5) Object-Oriented Programming  3-59
      (6) Reengineering  3-60
      (7) Reverse Engineering  3-61

4. OVERVIEW OF THE ACQUISITION LIFE-CYCLE  4-1
  4.1 PLANNING PHASE  4-1
    a. Identifying Automation Needs  4-1
    b. Planning  4-2
    c. Budgeting  4-3
  4.2 ACQUISITION PHASE  4-3
    a. Requirements Analysis  4-3
    b. Analysis of Alternatives  4-4
    c. Developing the Source Selection Plan (SSP)  4-5
    d. Developing the Solicitation  4-5
    e. Issuing the Solicitation  4-6
    f. Receiving Bids or Proposals  4-6
    g. Source Selection  4-7
      (1) Evaluating Bids or Proposals  4-7
      (2) Selecting the Contractor  4-8
      (3) Awarding the Contract  4-8
  4.3 IMPLEMENTATION PHASE  4-8
  4.4 ACTIVITIES AT THE END OF THE ACQUISITION LIFE-CYCLE  4-9

5. REQUIREMENTS ANALYSIS  5-1
  5.1 INTRODUCTION  5-1
  5.2 KEY FACTORS FOR SUCCESSFUL REQUIREMENTS ANALYSES  5-1
    a. Identify, Agree on, and Document Assumptions and Constraints  5-2
    b. Devote an Appropriate Level of Effort to the Requirements Analysis  5-2
    c. Reassess Requirements Periodically  5-3
    d. Distinguish Between Mandatory Requirements and Optional Capabilities  5-4
  5.3 ROLES AND RESPONSIBILITIES  5-5
    a. Program Personnel  5-5
    b. IRM/Technical Personnel  5-6
    c. Contracting Personnel  5-6
  5.4 PRELIMINARY STEPS  5-6
    a. Determining Overall Information Needs  5-7
      (1) Determining the Agency's Business Profile  5-7
      (2) Determining What the Current Systems Can Support  5-8
      (3) Defining Information Technology Objectives  5-9
    b. Planning and Budgeting for Software Projects  5-9
      (1) Identifying Software Projects  5-10
      (2) Preparing Plans  5-12
      (3) Budgeting  5-13
      (4) Deciding Whether to Continue  5-13
  5.5 TYPES OF REQUIREMENTS TO IDENTIFY  5-16
    a. Software Functional and Performance Requirements  5-17
      (1) Capabilities  5-17
      (2) User Interface  5-18
      (3) Workload  5-19
      (4) Speed  5-20
      (5) Growth  5-20
      (6) Reliability  5-21
      (7) Availability  5-21
      (8) Accessibility by Individuals with Disabilities  5-21
      (9) Control  5-24
      (10) Security and Privacy  5-24
      (11) Training  5-25
      (12) Conversion  5-26
      (13) Documentation  5-26
      (14) User and Operator Support  5-26
      (15) Standards  5-27
    b. Technical Environment Requirements  5-28
      (1) Hardware Environment  5-28
      (2) System Software Environment  5-28
      (3) Telecommunications Environment  5-29
      (4) Integration with Other Systems  5-29
  5.6 DOCUMENTING THE REQUIREMENTS ANALYSIS  5-29

6. ANALYSIS OF ALTERNATIVES  6-1
  6.1 OBJECTIVE  6-1
    a. Analysis of Technical Options  6-3
    b. Analysis of Acquisition Options  6-4
  6.2 CHOOSING BETWEEN ALREADY-AVAILABLE AND CUSTOM-DEVELOPED SOFTWARE  6-4
    a. Mandatory-for-Use Programs  6-5
    b. Mandatory-for-Consideration Programs  6-7
    c. Reengineering Existing Agency Software  6-7
    d. Using Another Agency's Software  6-9
    e. Commercial Software  6-10
    f. Custom-Developed Software  6-11
  6.3 KEY FACTORS FOR SUCCESSFUL ANALYSES OF ALTERNATIVES  6-11
    a. Base the Analysis on the Agency's Requirements  6-12
    b. Complete the Analysis Before Deciding on the Solution  6-13
    c. Identify, Agree on, and Document Assumptions and Constraints  6-14
    d. Use a Limited Set of Key Criteria to Distinguish among Options  6-15
    e. Do Not Eliminate Options by Dictating a Specific Approach  6-17
    f. Recognize that Requirements and Alternatives Evolve Over Time  6-18
    g. Test the Sensitivity of the Analysis to Changes in Assumptions and Constraints  6-21
  6.4 ROLES AND RESPONSIBILITIES  6-22
    a. IRM/Technical Personnel  6-22
    b. Program Personnel  6-23
    c. Contracting Personnel  6-24
  6.5 SURVEYING THE MARKET  6-25
    a. Sources of Information  6-27
      (1) Trade Publications  6-29
      (2) Vendor Marketing Literature  6-29
      (3) Other Publications  6-30
      (4) Trade Shows and Exhibitions  6-31
      (5) User Groups and Other Professional Associations  6-32
      (6) Other Government Users  6-32
      (7) Request For Information (RFI)  6-33
    b. What Information to Gather  6-34
      (1) Technical Feasibility  6-35
      (2) Service Availability  6-35
      (3) Degree of Competition Available  6-36
      (4) Pricing Information  6-36
      (5) Industry Support and Contractual Practices  6-37
  6.6 ANALYSIS OF TECHNICAL OPTIONS  6-38
    a. Performing Software Development Services On-site or Off-site  6-39
    b. Limiting the Techniques and Tools Used  6-40
    c. Limiting Unilateral Personnel Replacement  6-40
    d. Deciding the Range of Software Development Services  6-42
    e. Choosing Among Technical Options  6-44
      (1) Identifying the Options That Meet All Mandatory Requirements  6-45
      (2) Identifying the Key Discriminators  6-45
      (3) Comparisons  6-46
    f. Documenting the Technical Choice  6-48
  6.7 ANALYSIS OF ACQUISITION OPTIONS  6-49
    a. Noncontractual Options  6-49
      (1) Do Nothing  6-49
      (2) Reassign Software Development Services Resources within the Agency  6-50
      (3) Reschedule Software Development Services  6-50
      (4) Share Software Development Services with Another Agency  6-50
    b. Contractual Options  6-52
    c. Choosing Among the Acquisition Options  6-53
    d. Document the Analysis of Acquisition Options  6-53
  6.8 DOCUMENTING THE ACQUISITION STRATEGY  6-54

7. COMPETITION REQUIREMENTS  7-1
  7.1 CONTRACTING UNDER FULL AND OPEN COMPETITION  7-1
  7.2 DEGREES OF COMPETITION  7-2
    a. Full and Open Competition  7-2
    b. Full and Open Competition After Exclusion of Sources  7-3
    c. Other Than Full and Open Competition  7-3
      (1) Circumstances Permitting Other Than Full and Open Competition  7-4
      (2) Justification for Other Than Full and Open Competition  7-5
  7.3 COMPETITION ADVOCATES  7-6

8. REVIEW, APPROVAL, AND AUTHORIZATION PROCESS  8-1
  8.1 THRESHOLD LEVELS  8-1
  8.2 AGENCY-LEVEL REVIEW AND APPROVAL  8-2
  8.3 GSA REVIEW AND AUTHORIZATION  8-5

9. GENERAL CONTRACTING CONSIDERATIONS  9-1
  9.1 SOUND ACQUISITION PLANNING AND EXECUTION  9-1
    a. Match the Level of Effort to the Size and Scope of the Acquisition  9-1
    b. Maximize Competition  9-2
    c. Develop Clear Specifications and Work Statements  9-4
    d. Avoid Providing Information to Potential Offerors Unless Authorized  9-7
    e. Document and Follow the Source Selection Process  9-9
    f. Involve Each Discipline throughout the Acquisition Process  9-10
    g. Actively Seek Parallel Reviews and Approvals  9-11
    h. Document the Bases for Decisions  9-11
    i. Assign a Team with a Level of Expertise Equal to the Size and Scope of the Acquisition  9-12
  9.2 APPLICABLE REGULATIONS AND GUIDANCE  9-13
  9.3 ROLES AND RESPONSIBILITIES  9-14
    a. Program Personnel  9-14
    b. IRM/Technical Personnel  9-15
    c. Contracting Personnel  9-16
  9.4 IDENTIFYING REQUIREMENTS FOR THE SOFTWARE DEVELOPMENT CONTRACTOR  9-17
    a. Specific Development Services  9-17
    b. Government and Contractor Responsibilities  9-18
    c. Estimated Labor Hours by Personnel Category  9-23
    d. Minimum Personnel and Corporate Qualifications  9-23
    e. Deliverable Items and Delivery Schedule  9-25
    f. Interim Deliverables and Reviews  9-27
    g. Performance Measures  9-28
      (1) Quantifiable Performance Measures  9-28
      (2) Qualitative Software-Specific Measures  9-32
      (3) Compliance with Documentation Standards  9-33
      (4) Compliance with Statement of Work (SOW)  9-34
    h. Review, Approval, and Acceptance Procedures  9-34
    i. Financial Measures  9-36
      (1) Incentives  9-36
      (2) Liquidated Damages  9-36
      (3) Deduction Schedules  9-37
    j. Status Reports and Progress Briefings  9-37
    k. Standards  9-39
    l. Development Methodology  9-40
    m. Technical Development Environment  9-41
    n. Software Maintenance  9-41
    o. Configuration Management  9-42
    p. Site Security  9-43
    q. Personnel Security  9-44
    r. Records Management  9-44
    s. Budget  9-45
    t. Contract Life  9-46
  9.5 SELECTING A CONTRACT TYPE  9-46
    a. Contract Types  9-47
      (1) Fixed-Price Contracts  9-49
      (2) Cost-Reimbursement Contracts  9-49
      (3) T&M Contracts  9-51
      (4) Incentive Contracts  9-52
    b. Issuing Agencywide Indefinite Delivery Contracts  9-56

10. CONTRACTING BY NEGOTIATION  10-1
  10.1 ATTRIBUTES OF NEGOTIATED CONTRACTS  10-2
    a. Differences Between Negotiation and Other Contracting Methods  10-2
    b. Contract Attributes  10-4
  10.2 THE SOURCE SELECTION PLAN (SSP)  10-4
    a. Source Selection Team Organization  10-5
    b. Determining the Evaluation Factors  10-8
      (1) Price or Cost  10-8
      (2) Technical or Quality  10-9
    c. Source Selection Approaches  10-18
  10.3 STEPS IN CONTRACTING BY NEGOTIATION  10-21
    a. Prepare and Issue the Solicitation  10-21
      (1) Write the Solicitation  10-21
      (2) Develop an Independent Government Cost Estimate (IGCE)  10-27
      (3) Obtain Industry Comments on the Draft Solicitation (Optional)  10-28
      (4) Hold a Presolicitation Conference (Optional)  10-29
      (5) Develop Source Selection Materials  10-29
      (6) Publicize the Solicitation in the CBD  10-32
      (7) Issue the Solicitation  10-32
      (8) Hold a Preproposal Conference (Optional)  10-32
      (9) Answer Questions and Amend the Solicitation  10-33
    b. Evaluate Proposals  10-34
      (1) Train the SST  10-35
      (2) Receive Proposals  10-36
      (3) Determine Whether Proposals Comply with Solicitation Instructions  10-36
      (4) Evaluate Proposals Against Minimum Technical Requirements  10-37
      (5) Request Clarification  10-37
      (6) Rate Technical Proposals  10-38
      (7) Conduct Initial Total Cost Evaluation  10-39
      (8) Establish the Competitive Range  10-39
    c. Selecting a Contractor  10-40
      (1) Conduct Discussions and Negotiations  10-40
      (2) Request BAFOs  10-41
      (3) Evaluate BAFOs  10-42
      (4) Recommend the Apparent Winner  10-43
      (5) Conduct Responsibility and Legal Reviews  10-43
      (6) Select the Prospective Contractor  10-45
      (7) Notify Unsuccessful Offerors  10-46
      (8) Debrief Unsuccessful Offerors  10-46

11. OTHER ACQUISITION METHODS  11-1
  11.1 ACQUISITION BY SMALL PURCHASE  11-1
    a. Purpose of Small Purchase Procedures  11-2
    b. Using Small Businesses  11-3
    c. Competition  11-4
      (1) Purchases of $2,500 or less  11-4
      (2) Purchases between $2,500 and $25,000  11-4
  11.2 ACQUISITION FROM ESTABLISHED SOURCES OF SUPPLY  11-5
    a. Benefits of Using Established Sources of Supply  11-5
    b. Available Sources  11-6
      (1) Agencywide Indefinite Delivery Contracts  11-6
      (2) GSA Contracts  11-7
  11.3 ACQUISITION BY SEALED BIDDING  11-8

12. CONTRACT ADMINISTRATION  12-1
  12.1 ROLES AND RESPONSIBILITIES  12-1
    a. Contracting Personnel  12-1
    b. IRM/Technical Personnel  12-5
    c. Program Personnel  12-5
  12.2 PREPARING FOR THE CONTRACTOR  12-6
    a. Obtaining and Preparing Space, GFE, and GFI  12-6
      (1) Space  12-6
      (2) Government Furnished Equipment  12-8
      (3) Government Furnished Information  12-9
    b. Preparing Agency Personnel to Work with the Contractor  12-9
    c. Reviewing the Contractor's Project Plan  12-10
    d. Developing a Tracking and Reporting System  12-12
  12.3 POST-AWARD CONFERENCE  12-13
  12.4 CONTRACT MONITORING  12-14
    a. Communicating with the Contractor  12-16
    b. Enforcing Key Personnel Clauses  12-17
    c. Obtaining Performance  12-18
      (1) Technical Performance  12-19
      (2) Cost Performance  12-19
      (3) Schedule Performance  12-20
    d. Determining Awards and Incentives  12-21
    e. Exercising Contract Options  12-23
    f. Contract Modifications  12-24
      (1) Administrative Changes  12-25
      (2) Requirements Changes  12-25
    g. Warranties  12-25
    h. Quality Assurance (QA) Deduction Schedules  12-27
    i. Claims  12-28
    j. Contract Terminations  12-28
      (1) Reasons for Termination  12-29
      (2) Procedures for Termination  12-30
    k. Contract Closeout  12-32
    l. Contract Reviews and Audits  12-33
    m. Importance of the COTR in Software Contract Monitoring  12-34

13. SOFTWARE DEVELOPMENT  13-1
  13.1 FUNCTIONAL AND DATA REQUIREMENTS ANALYSIS  13-1
    a. Determine Detailed Functional Requirements  13-2
    b. Determine Detailed Data Requirements  13-4
  13.2 DESIGN  13-6
    a. Preliminary Design  13-6
    b. Detailed Design  13-8
    c. Design Review  13-11
  13.3 BUILD  13-13
  13.4 TESTING  13-16
    a. Ensure Thorough Testing  13-17
    b. Levels of Testing  13-17
      (1) Unit  13-18
      (2) Integration  13-18
      (3) System  13-18
  13.5 READINESS REVIEW  13-19

14. SOFTWARE TESTING AND ACCEPTANCE  14-1
  14.1 SOFTWARE QUALITY ASSURANCE (QA)  14-1
    a. Development Contractor  14-3
    b. Agency  14-4
    c. Support Contractor  14-5
  14.2 DETERMINING TEST OBJECTIVES  14-6
  14.3 DECIDING WHAT TO TEST  14-7
  14.4 TYPES OF TESTING  14-10
    a. Design Testing  14-10
    b. Software Testing  14-11
  14.5 CHOOSING AMONG TEST METHODS  14-12
    a. Benchmark Testing  14-13
    b. Black Box Testing  14-13
    c. White Box Testing  14-14
    d. Testing with Fabricated Data  14-14
    e. Testing with Real Data  14-15
  14.6 DETERMINING RESOURCES FOR TESTING  14-16
  14.7 ACCEPTING THE SOFTWARE AFTER SUCCESSFUL TESTING  14-16

15. SOFTWARE IMPLEMENTATION  15-1
  15.1 IMPLEMENTATION  15-1
    a. Implementation Steps  15-2
      (1) Pre-installation  15-3
      (2) Installation  15-5
      (3) Post-installation  15-6
    b. Implementation Considerations  15-6
      (1) Cutover Approach  15-8
      (2) Site Preparation  15-11
      (3) Data Conversion  15-12
      (4) Interface Programs  15-13
      (5) Procedural Changes  15-14
      (6) Documentation  15-14
      (7) Preparing the Users  15-15
      (8) User Training  15-16
      (9) Back-up Procedures  15-16
      (10) Security  15-17

16. SOFTWARE OPERATION AND MAINTENANCE  16-1
  16.1 SOFTWARE OPERATION AND SUPPORT  16-1
    a. Performance Monitoring  16-2
    b. Continued Configuration Management  16-4
  16.2 SOFTWARE MAINTENANCE  16-6
  16.3 SOFTWARE MODIFICATION/ENHANCEMENT  16-8

APPENDIX A - GLOSSARY AND ACRONYMS  A-1
APPENDIX B - LEGISLATIVE AND REGULATORY ENVIRONMENT  B-1
APPENDIX C - BIBLIOGRAPHY  C-1
APPENDIX D - GSA ASSISTANCE PROGRAMS  D-1
APPENDIX E - CONTACTS FOR MORE INFORMATION  E-1
APPENDIX F - OFFICE OF FEDERAL PROCUREMENT POLICY (OFPP) POLICY LETTER 91-2  F-1

LIST OF EXHIBITS

3-1 Differences Between Source and Object Code  3-12
4-1 The Acquisition Life-Cycle  4-2
6-1 Overview of the Analysis of Alternatives Process  6-2
6-2 Key Factors for Successful Analyses of Alternatives  6-12
6-3 The Analysis Process  6-19
6-4 Examples of Information Sources for the Software Development Services Market Survey  6-28
6-5 Examples of Costs and Benefits by Type  6-47
8-1 Typical Agency-Level Reviews  8-3
10-1 Acquisition Steps  10-22
10-2 Uniform Contract Format  10-24
12-1 Division of Contract Administration Tasks  12-2
12-2 Typical Contract Administration Responsibilities  12-4
12-3 Sample Agenda Items  12-15
15-1 Sample Activities of Software Implementation  15-7
15-2 Advantages and Disadvantages of Cutover Approaches  15-9

CHAPTER 1: INTRODUCTION

1.1 THE ACQUISITION GUIDE SERIES

The General Services Administration (GSA) initiated the Acquisition Guide series to help promote effective and efficient acquisition of Federal information processing (FIP) resources. The Policy Analysis Division (KMP) of the Office of Information Resources Management Policy (KM) develops and issues the guides.

a. Background

The acquisition guides resulted from a survey of Federal agencies by GSA to determine the types of guidance that agencies would find helpful when acquiring FIP resources. In addition to guidance on approaches and techniques, the guides also address laws, regulations, directives, and policies that affect Federal acquisitions.

b. Audiences

GSA developed the guides for Government professionals who are new to, unfamiliar with, or need to refresh their knowledge of the process for acquiring FIP resources. While other groups may also find them useful, the guides are directed to the three staffs most involved in the acquisition of FIP resources:

o Program personnel (users supported directly by the FIP resources being acquired) -- For functional expertise
o Information Resource Management (IRM)/Technical personnel -- For technical expertise on FIP resources
o Contracting personnel -- For contracting expertise

In discussing each phase of the acquisition life-cycle, the guides outline the roles and responsibilities for each of the three staff disciplines.

c. Objectives

The guides are intended to provide basic advisory and factual information to acquisition team members who are relatively new to or want to reinforce their knowledge of the process. They are not intended to be encyclopedias of acquisition information. Where applicable, they refer to the most commonly applied laws, regulations, and guidance and then direct the reader to other sources for more complete information.

1.2 SCOPE OF THE SOFTWARE DEVELOPMENT SERVICES GUIDE

This Guide, entitled A Guide for Acquiring Software Development Services, includes all phases of the acquisition life-cycle, from the first consideration of software as part of the solution to an agency's information need to contract closeout. It addresses acquisitions that focus on software development services. A Guide for Acquiring Systems Integration Services, another guide in this series, covers acquisitions that include other products and services (e.g., hardware, maintenance services) as well as software development services. While this Guide does not describe every activity at every step of the acquisition process, it does provide an overview of these steps and the key success factors for completing them and the overall acquisition successfully.

a. The Complete Acquisition Life-Cycle

This Guide covers four phases of the acquisition life-cycle process:

o Planning -- This phase consists of the early steps of an acquisition, including identifying the agency's information needs, identifying the needs that can be met through software, and budgeting resources for it.
o Acquisition -- This phase consists of activities necessary for acquiring software development services, including developing the solicitation. These activities include the requirements analysis, the analysis of alternatives, and other documentation to support the acquisition strategy. The analysis of alternatives examines the choice between software already available (e.g., commercial) and custom-developed. It also examines both technical options (i.e., how the software development services will be performed) and acquisition options (i.e., what source will provide the services, including both contractual and noncontractual). Finally, this phase usually includes the solicitation of proposals or bids, their evaluation, and award of a contract.
o Implementation -- This phase consists of tasks from contract award to the point at which the contractor has completed the software development services required under the contract.
o End of Life-Cycle -- This phase consists of all the tasks for closing out the contract and making final payment to the contractor.

Chapter 4 provides additional information about the acquisition life-cycle, including typical tasks for each phase.

b. Limitations

This Guide is not a complete source for information about software development services acquisitions. Agencies' requirements are so varied that one guide could not adequately address every situation. In addition, the restrictions that apply to Federal agencies are subject to change. Rather than duplicate the sources that articulate these restrictions, including the Federal Information Resources Management Regulation (FIRMR), the Federal Acquisition Regulation (FAR), the Federal Property Management Regulation (FPMR), Federal Information Processing Standards Publications (FIPS PUBS), Office of Management and Budget (OMB) Circulars and Bulletins, and agency-specific directives, this Guide directs the reader to the appropriate sources for current information.

1.3 ORGANIZATION OF THE SOFTWARE DEVELOPMENT SERVICES GUIDE

a. General Information

In Chapter 2, the Guide provides the overall context for software development services acquisitions, including the scope of software development services, key terms, roles of each staff discipline, and the key factors for successful acquisitions. Chapter 3 describes special characteristics of software development services.


b. Life-Cycle Focus

In addition to describing special characteristics of software development services, Chapter 3 provides an overview of the software development life-cycle, as distinguished from the acquisition life-cycle. Chapter 4 provides an overview of the acquisition life-cycle. Chapters 5 through 12 proceed sequentially through the activities to acquire a software development services contractor. Chapter 5 describes the requirements analysis, which defines the agency's software needs in functional and technical terms. Chapter 6 covers the analysis of alternatives, which examines both technical (what to acquire) and acquisition (how to acquire) options. Chapters 7, 8, and 9 provide general guidance for activities important to acquiring software development services by contract. Chapter 9 also describes how to define the requirements for the development process that the contractor must follow. Chapter 10 covers the steps of contracting by negotiation. Chapter 11 addresses other acquisition methods. Finally, Chapter 12 describes contract administration and implementation activities. Chapters 13 through 16 describe the process of developing, implementing, and maintaining the new software. For convenient reference, Appendix A provides a glossary of commonly used terms and acronyms. Appendix B describes the regulatory environment for acquisitions. Appendix C is a bibliography of other publications related to software development services. Appendix D describes GSA assistance programs that may be useful in software development services acquisitions, and Appendix E provides a list of contacts for more information. Appendix F is a policy letter about quality in contracting for services from the Office of Federal Procurement Policy (OFPP).

CHAPTER 2: GENERAL INFORMATION

This chapter provides the following important general information about software development services acquisitions:

o Definition of key terms used in the Guide
o Scope of software development services acquisitions
o Roles and responsibilities of Government personnel who participate in the acquisition process
o Key actions that will help make software development services acquisitions successful
o Protests and ways to avoid them

2.1 DEFINITION OF KEY ACQUISITION TERMS

The Federal Acquisition Regulation (FAR) and the Federal Information Resources Management Regulation (FIRMR) define most terms used in an acquisition of Federal information processing (FIP) resources. Definitions of the following key terms are included here to help the reader make maximum use of this Guide.

a. Federal Information Processing Resources

The FIRMR defines FIP resources as automatic data processing equipment (ADPE) and any equipment or interconnected system or subsystems of equipment that is used in the automatic acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information-- (1) by a Federal agency, or (2) under a contract with a Federal agency which-- (a) requires the use of such equipment, or (b) requires the performance of a service or the furnishing of a product which is performed or produced making significant use of such equipment. Such term includes computers; ancillary equipment; software, firmware, and similar procedures; services, including support services; and related resources as defined by regulations issued by the Administrator of General Services.

b. Acquisition

According to the FAR, acquisition is:

[T]he acquiring by contract with appropriated funds of supplies or services (including construction) by and for the use of the Federal Government through purchase or lease, whether the supplies or services are already in existence or must be created, developed, demonstrated, and evaluated. Acquisition begins at the point when agency needs are established and includes the description of requirements to satisfy agency needs, solicitation and selection of sources, award of contracts, contract financing, contract performance, contract administration, and those technical and management functions directly related to the process of fulfilling agency needs by contract.

c. Requirements Analysis

A requirements analysis establishes the agency's need for software development support services. It includes the early stages of determining information needs based on the agency's mission and activities, identifying how software can help achieve long-term automation objectives and short-term automation needs, and considering budgeting needs for the software. The initial requirements analysis also forms the basis for later development of detailed objectives and requirements for the software.

d. Analysis of Alternatives

While the requirements analysis establishes the agency's needs for software, the analysis of alternatives establishes how those needs will be met. This includes choosing between already-existing software and custom-developed software. This analysis helps the agency choose the most advantageous alternative (consisting of technical and acquisition options) for its need. The analysis also examines both noncontractual (e.g., sharing with another agency) and contractual choices.

e. Acquisition Strategy

The end result of the analysis of alternatives is an acquisition strategy that defines, in detail, the mechanisms and methodology to be used to acquire software development services. It includes decisions about the degree of competition, contract type (including pricing structure), and delivery schedule.

2.2 WHAT ARE SOFTWARE DEVELOPMENT SERVICES?

a. Software

According to the FIRMR, FIP software means any software, including firmware, specifically designed to make use of and extend the capabilities of FIP equipment. Software was invented in the early days of computers, when equipment was designed to solve specific numerical problems (e.g., artillery ballistics). The equipment could not easily be adapted to solve other problems; therefore, engineers conceived separate categories of "hardware" and "software," so computers could support a wider range of problems. They built general capabilities (e.g., data storage, basic calculation) into the physical hardware and left the solution of specific problems to software, which told the hardware in what order to apply the general capabilities. Thus, software is the set of instructions that tells the hardware what to do. A series of instructions that performs a particular task is called a "program." Software stored in non-volatile memory (i.e., memory that is not erased when the computer is turned off) is called "firmware." Software falls into two broad categories:

o System Software -- This directs the operation of the hardware on which the software runs. System software also supports the development and operation of applications software (see below). The basic classes of systems software include--
oo Operating environment -- Operating system, security, job scheduling, transaction processing, and performance monitoring software
oo Applications development -- Programming languages, database management systems, text editing, computer-aided software engineering (CASE), debugging tools, and test data generators
oo Utility -- Code conversion, copying, disk management, tape management, backup, archiving, recovery, printing, chargeback and billing, and configuration management

o Application Software -- This provides the capabilities to meet the software users' computing needs and support their job activities. The basic classes of applications software include--
oo Office Tools -- Word processing, spreadsheets, electronic mail, graphics, and desktop publishing
oo Functional -- Software that supports agency missions and activities, including financial management, project management, inventory control, etc.

This Guide distinguishes between software that already exists and software that is custom developed to meet the agency's information needs. The Guide covers acquisition of services to develop functional application software to solve a particular information need that cannot be met with existing software.
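The idea that a program is an ordered series of instructions applying general hardware capabilities (storage, calculation, display) to one specific task can be illustrated with a short fragment. The sketch below is not part of the Guide; it is a minimal example in Python, and the file name and record layout are hypothetical.

```python
# Illustrative only: a tiny "functional" application program.
# The file name and record layout are hypothetical.

def total_obligations(path):
    """Read one comma-separated record per line (office, dollar amount)
    and return the total -- an ordered series of instructions that applies
    general capabilities (read, add, report) to one specific task."""
    total = 0.0
    with open(path) as records:          # storage capability: read data
        for line in records:
            office, amount = line.strip().split(",")
            total += float(amount)       # calculation capability: add
    return total

if __name__ == "__main__":
    print("Total obligations:", total_obligations("obligations.csv"))  # display capability
```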


(1) Already-existing Software

Software may already exist that can meet the agency's needs. Software packages and tools sold commercially are referred to as commercial software. The FIRMR defines commercial software as software that is available through lease or purchase in the commercial market from a concern representing itself to have ownership of marketing rights in the software. This includes software furnished as part of the FIP system, but separately priced. An operating system would be an example of software that falls into this category. The other category of existing software covers software packages that, while not commercial, were developed by an agency to solve a similar need. Other agencies may be able to utilize this existing software for their own purposes with little or no customization. The acquisition of already-existing software, commercial or otherwise, is outside the scope of this acquisition guide. A Guide for Acquiring Commercial Software, another guide in this series, describes the acquisition of commercial software in detail.

(2) Custom-developed Software

In certain circumstances, acquiring existing software might not meet the needs of or be most advantageous to an agency. In this case, the agency would develop new application software. The decision to use existing software or develop new software should be based on two factors: (1) the ability of existing software to meet the agency's requirements and (2) the risk tradeoff between customizing existing software and developing new software.

A. Ability to Meet Agency Requirements

To determine whether existing software can meet its requirements, the agency must first define the functional and performance capabilities the software must have (e.g., the ability to consolidate X million records of size Y during Z hours). The agency must then determine if software packages available in the marketplace or from other agencies can meet its requirements. The software must not only provide all the capabilities that the agency defined, but must also operate in the agency's FIP resources environment (i.e., with the existing or planned hardware and systems software). If commercial software or another agency's software meets the agency's requirements at a reasonable price, the agency should acquire the existing software rather than develop new software. In some cases, the agency may find it advantageous to modernize its FIP environment (hardware and software) to take advantage of existing software. Even if no specific software package meets all the agency's requirements, it may be possible to customize one so that it meets the requirements. If customization is required, the agency should determine if the degree of modification is within acceptable limits.

B. Risks of Customizing Existing Software

Before deciding to customize existing software, the agency should assess the risks involved. When a significant portion of an existing software package's capabilities is altered, many of its benefits may be lost or reduced. Such benefits include:

o Ease of training, maintenance, and support
o Using proven software with an established track record
o Compatibility and integration with other existing software
o Ability to use standard documentation and training materials
o The software developer's warranty against defects

Another risk of customization is that the new code may affect the software's performance or introduce errors into the existing code. In addition, later versions of the software may be incompatible with the customized code or require a significant rewrite of it. The agency should weigh these risks and their costs against the risks and costs of developing new software to determine which approach would be safer and cheaper.
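One way to make the customize-versus-develop tradeoff explicit is a simple weighted comparison of the key risks and costs. The sketch below is illustrative only and is not prescribed by the Guide; the criteria, weights, and scores are hypothetical placeholders that an acquisition team would replace with values drawn from its own requirements analysis and market survey.

```python
# Hypothetical weighted comparison of two alternatives. All criteria,
# weights, and scores are placeholders, not values from the Guide.

CRITERIA = {
    # criterion: relative weight (weights sum to 1.0)
    "meets functional requirements": 0.35,
    "life-cycle cost":               0.25,
    "schedule risk":                 0.20,
    "maintainability and support":   0.20,
}

# Scores on a 1 (poor) to 5 (excellent) scale for each alternative.
SCORES = {
    "customize existing package": {
        "meets functional requirements": 4,
        "life-cycle cost":               4,
        "schedule risk":                 3,
        "maintainability and support":   2,
    },
    "develop new software": {
        "meets functional requirements": 5,
        "life-cycle cost":               2,
        "schedule risk":                 2,
        "maintainability and support":   4,
    },
}

def weighted_score(alternative):
    """Combine one alternative's criterion scores using the weights."""
    return sum(CRITERIA[c] * SCORES[alternative][c] for c in CRITERIA)

if __name__ == "__main__":
    for alt in SCORES:
        print(f"{alt}: {weighted_score(alt):.2f}")
```

A comparison like this does not replace the analysis of alternatives described later in the Guide; it simply documents, in one place, the assumptions behind the choice so they can be reviewed and revised.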

b. Scope of Software Development Services Acquisitions

(1) Specific Services That Could be Included in a Software Development Services Acquisition

To make full use of new software, the agency may also require related services, such as training, conversion, software maintenance, and user support. Many software developers offer these additional services. If the agency can obtain them from the same source as the software development services, they could be covered by a single software development services acquisition.

(2) Software Development Services Acquisitions vs. Systems Integration Services Acquisitions

Systems integration, a widely-used term in both Government and industry, refers to the acquisition of large, complex information systems with integrated support. As discussed in A Guide for Acquiring Systems Integration Services, another guide in this series, a systems integration services (SIS) acquisition has the following key elements:

o It is a complete information solution
o It includes several types of FIP resources
o A single vendor is responsible for the project

Because SIS acquisitions are intended to solve information problems, they frequently include custom-developed software as a component. However, they also include other components (e.g., hardware). SIS acquisitions are outside the scope of this Guide, but this Guide supplements A Guide for Acquiring Systems Integration Services by providing detailed guidance about the software development component of an SIS acquisition.

c. Relation of Software Development Services to Other FIP Resources

Software development services are one type of FIP support service. FIP support services are defined in FIRMR Part 201-4 as "any commercial nonpersonal services, including FIP maintenance, used in support of FIP equipment, software, or services." FIRMR Bulletin A-1 lists several examples of FIP support services, including custom software development.

d. Relation of This Guide to Others in the Acquisition Guide Series

In addition to the commercial software and SIS guides mentioned above, three other guides in this series provide agencies with useful information about acquiring software development services:

o A Guide to Acquiring FIP Support Services covers FIP support services as a group.
o A Guide to Requirements Analysis and Analysis of Alternatives describes policies and procedures for carrying out these two steps of the FIP resources acquisition process.
o A Guide to Performance and Capability Validation (P&CV) describes P&CVs and their use in acquiring FIP resources, including software.

2.3 ROLES AND RESPONSIBILITIES

A successful software development services acquisition requires involvement of program, Information Resource Management (IRM)/technical, and contracting staff. Some responsibilities are specified by regulation. Others, however, may vary from acquisition to acquisition and from agency to agency. For these, the nature and degree of involvement varies with the complexity of the acquisition and the stage of the acquisition life-cycle. A senior official in the agency should designate a team with responsibility for each large, complex software development services acquisition. The team should consist of representatives from each staff discipline (program, IRM/technical, contracting). The team is responsible for ensuring that the acquisition--

o Successfully meets the agency's needs,
o Satisfies all legal and regulatory requirements, and
o Remains on schedule and within budget.

Teamwork is an important factor in successful software development services acquisitions. Although team members have specific assignments, success requires both cooperation and consultation among the three staff disciplines. By conducting their assignments in parallel (e.g., reviewing solicitations) rather than sequentially, they can significantly shorten the acquisition lead time. Most large, complex software development services acquisitions involve six staff roles: End User, Program Manager (PM), IRM/Technical Representative, Contracting Officer (CO), Contracting Officer's Representative (COR), and Contracting Officer's Technical Representative (COTR). Representatives from the three staff disciplines take these roles.

a. End User

The end users will interact with and use the software on a day-to-day basis. The ultimate success of the acquisition will be judged by how well the software supports the end users' needs, including both system capabilities and ease of use. End users are typically program (i.e., line) staff, although they could be IRM/technical staff for particular applications (e.g., specialized data manipulation utilities). End users should be involved throughout the life-cycle of each major acquisition of software development services, as well as during the development of the actual software. Their input is particularly valuable during the initial requirements analysis phase. They should remain involved, by receiving status updates and serving as advisors, whenever a long time lag develops between the requirements analysis and the contract award.


b. Program Manager

The agency should designate a PM for all software development services acquisitions. The PM represents the end users throughout the acquisition and is responsible for ensuring that the software meets the agency's long- and short-term needs. Initially, the PM may participate in strategic and tactical planning that leads to the acquisition. The PM will typically complete the analyses and documents (e.g., analysis of alternatives) required by the FIRMR. The PM may also help prepare some of the supporting acquisition documents (e.g., program-related portions of the solicitation).

c. Information Resources Management/Technical Representative

The IRM/technical organization provides technical expertise to the PM and the CO throughout the acquisition process. One of its key functions is translating end users' requirements for automated support of particular business functions into specific types of FIP resources, such as software development services. As acquisition requirements dictate, the IRM/technical staff may be called upon to assist in:

o Defining requirements for software functions, the technical environment in which the software will run, and the process by which it will be developed
o Conducting market research
o Writing statements of work, specifications, evaluation factors and criteria, and technical material for the solicitation
o Performing the technical evaluation of proposals

d. Contracting Officer

FAR Part 1 places authority and responsibility with agency heads to contract for authorized supplies and services. Agency heads, in turn, delegate to COs the authority to enter into, administer, and terminate contracts. Agency heads (or their designees) issue warrants to COs stating the limits of their authority. However, only GSA can provide agencies with the authority to contract for FIP resources. A delegation of procurement authority (DPA) to the agency's Designated Senior Official (DSO) provides this authority. GSA provides DPAs to DSOs in three ways:

o A regulatory DPA with thresholds up to those set forth in the FIRMR
o A specific agency DPA with thresholds higher or lower than those provided by the FIRMR
o A specific acquisition DPA in response to an Agency Procurement Request (APR)

Although the agency's DSO may redelegate GSA's exclusive authority for FIP resources to qualified officials, only a CO may enter into and sign a contract on behalf of the Government; therefore, GSA's procurement authority for FIP resources must be redelegated to the CO.

e. Contracting Officer's Technical Representative

A CO usually administers several contracts at the same time and generally designates a COTR to handle specific contract administration functions. The COTR usually has technical expertise and may come from either the program or IRM/technical organization. Typically, the COTR serves as a technical liaison between the Government and the contractor. Duties assigned may include monitoring the contractor's technical, schedule, and cost performance against the terms of the contract, approving invoices, and accepting deliverables. The contract must identify the COTR, who is limited to the activities specifically authorized in the contract and identified in writing by the CO. The COTR does not have authorization to change (add, delete, or modify) contract terms, conditions, or requirements, or to take any action that might give this appearance. The CO alone has the authority to make changes. Any changes must be confirmed in writing. A very large, complex contract for software development services may require multiple COTRs. In this case, the CO may appoint a COR as an interface between the COTRs and the CO. The COR is frequently a contract specialist, but may also come from the program or IRM/technical organization. Some agencies use the terms COR and COTR interchangeably.

2.4 KEYS TO SUCCESSFUL ACQUISITIONS

Successful acquisitions have several basic characteristics. By following the advice below, agencies can avoid the majority of situations that hamper the acquisition of software development services.


a. Define Requirements Functionally

Avoid deciding in advance on a specific solution to the agency's software development needs. With the tremendous growth in Government computing over the last decade, program personnel have become sophisticated and knowledgeable about information products and services and about the detailed aspects of systems. Every office develops its own computer "experts" who have their own favorite user interfaces, database management systems (DBMS), programming languages, CASE tools, and development methodologies. Sometimes users will borrow brand names adopted by particular vendors and inappropriately apply them to whole classes of products or services. These factors can lead agencies to think of their software development needs in terms of a particular interface, DBMS, CASE tool, or methodology. Avoid this pitfall, especially for large or complex acquisitions, for the following reasons:

o Using brand names, even unintentionally, precludes full and open competition, which is mandated by statute.
o An in-house "expert's" particular favorite may not be well suited for less sophisticated users in the organization.
o In some cases, real or perceived bias towards a particular vendor's products or services has resulted in contract protests, adverse publicity, and Congressional investigations.

Instead of naming a specific interface, DBMS, or other piece of system software, focus on defining the agency's underlying functional needs -- the capabilities that the new software should provide. Chapter 5, Requirements Analysis, describes this functional approach to defining software requirements in detail. Similarly, avoid naming a specific CASE tool or proprietary development methodology, but instead define the agency's needs for quality and efficiency in the development process. Chapter 9, General Contracting Considerations, describes the functional approach to the software development process. Occasionally, it may be appropriate to specify a product, methodology, or tool by name; e.g., when the agency has already established an organizational standard by careful study and competition. However, the agency must justify such action using the procedures discussed in Chapter 7, Competition Requirements.

b. Obtain Full and Open Competition

As mandated by statute, COs must promote and provide for full and open competition when soliciting offers and awarding contracts. A competitive market serves the best interests of the agencies, the taxpayers, and the vendor community. Agencies' interests are best served because full and open competition provides them with the widest possible offerings to meet their information needs. The taxpayers' interests are best served because full and open competition encourages vendors to offer their best prices. Vendors' interests are best served because their access to Government business is maximized. Although the contracting staff is usually sensitive to the need for full and open competition, program and IRM/technical staffs sometimes overlook its benefits. Therefore, the acquisition team should periodically take time for a deliberate, objective review of the requirements, alternatives, solicitations, source selection plan, and other materials to ensure that they do not unnecessarily limit competition. Although full and open competition is generally in the agency's best interests, occasionally circumstances may arise in which it becomes necessary to restrict competition to a degree. This is discussed in more detail in Chapter 7, Competition Requirements.

c. Guard Procurement-Sensitive Information

One essential element for maintaining full and open competition is ensuring that all potential offerors have access to the same information about the acquisition. At the same time, offerors do not need access to all information about the acquisition. The agency must withhold some information from offerors, such as the Government's independent cost estimate and the source selection materials, in order not to influence their proposals. No offeror should be given information about the number or identity of other offerors or the nature of their technical offerings or cost. An offeror with access to information that others lack has an unfair advantage. This offeror could use the information to change its proposal and win the contract, even though another offeror's proposal would have provided more value to the Government. Agency personnel need to exercise special caution if contractor personnel are working on site at the agency. Most often the disclosure of information happens accidentally: an overheard remark, a report lying open on a desk, a slip during a phone conversation. However, laws prohibit the release of procurement-sensitive information (e.g., the Procurement Integrity Act, Trade Secrets Act). The release of such information has sometimes led to protests, scandal, adverse publicity, Congressional investigations, and criminal and civil penalties.

d. Identify, Agree on, and Document Assumptions and Constraints

Identify explicitly the assumptions used in the analyses supporting the acquisition. Assumptions can be both quantitative (e.g., the contract life will be five years) and qualitative (e.g., the agency will need to lease extra office space before awarding a contract for on-site software development services). Program, IRM/technical, and contracting staff should agree that the assumptions reflect the agency's best estimate. Document assumptions so they can be discussed, analyzed, and modified if necessary. Also, test the sensitivity of the analyses and conclusions to changes in the assumptions. Finally, document the constraints that affect the software development services acquisition. These include any policy, staffing, organizational, or budgetary factors that limit the agency's discretion. For example, the agency's need to monitor the software development process closely may be constrained by lack of space for on-site contractor personnel. Documenting the constraints and their impacts on the acquisition is essential.

e. Involve All Three Staff Disciplines

Chapter 1, Introduction, points out that this Guide was written for three audiences: program, IRM/technical, and contracting personnel. All three staff disciplines bring knowledge and skills to the acquisition process. The program staff understands the requirements to be met. The IRM/technical staff, familiar with the types of software development services available in the marketplace, helps translate these requirements into the software development services to be acquired. The contracting staff provides liaison with the vendor community and is generally more aware of any legislative and regulatory constraints affecting the acquisition. The CO manages the contractual aspects of the acquisition. Although one group may have the lead responsibility for a particular step in the acquisition life-cycle, it should seek the advice and counsel of the other two during that step. In addition, each group should be kept informed of the status of the acquisition throughout its life-cycle. Sharing information will help keep the acquisition running smoothly and identify any potential problems or constraints early, when they are easier to handle. Parallel involvement and review can also shorten the acquisition lead time.

f. Consider Agency-Specific Requirements or Restrictions

This Guide is written for Governmentwide use. It describes legislative and regulatory constraints and acquisition procedures that apply to every agency. In addition, an agency may have its own specific documentation requirements, review and approval process, or contracting mechanisms (e.g., an agencywide requirements-type contract already in place). Review and understand the agency's internal procedures and situation before acting on the advice and guidance provided in this Guide.

g. Begin the Acquisition Process Early

The acquisition process for any type of FIP resources may be a lengthy one because of the legislative and regulatory framework in which it operates. To protect the interests of the taxpayers, the Government, and industry, agencies must fully analyze and document their requirements. They must weigh and document the benefits and costs of alternatives. They must follow review and approval procedures -- both internal and external to the agency. They must solicit bids or proposals, give offerors time to respond, and evaluate the bids or proposals carefully. If an offeror files a protest, it generally must be resolved before the contract is awarded or the software development services can be provided. Each of these steps takes time. Even in an acquisition of software development services for a small system, months may elapse between the initial determination of the need and the acquisition of the software development services. For large negotiated acquisitions of software development services to support mission-critical systems, the elapsed time could be 18 to 36 months. An agency also must have funds available to acquire the software development services. The normal budget lead time is two years. Take these lead times into account in planning the acquisition. The length of time an acquisition requires often surprises program and IRM/technical staff. An agency can minimize frustration by beginning the process early, before the need for the software development services or the resulting new system becomes acute, and anticipating the documentation needs for future acquisition activities during the early stages of the process.

2.5 PROTESTS

Protests challenge contract awards or other actions of agencies on specific acquisitions. The number of protests has increased dramatically since the passage of the Competition in Contracting Act (CICA). Protests occur when an interested party challenges the agency's solicitation document, objects to a contract award or proposed award, or perceives a procedural violation during the acquisition process. Many protests assert that an interested party has been or will be wrongfully excluded from consideration for award. Protests are usually filed with the agency, the General Accounting Office (GAO), or the General Services Board of Contract Appeals (GSBCA). While the procedures of and remedies available from each of these organizations differ, a successful protest could result in one or more of the following:

o Suspending, revoking, or revising the agency's procurement authority
o Modifying or terminating the contract
o Withholding exercise of options under the contract
o Asking for new Best and Final Offers (BAFOs) or reopening negotiations
o Amending the solicitation or issuing a new one
o Awarding protest costs, proposal preparation costs, and attorney's fees
o Awarding the contract to the protestor

These are serious consequences and may prevent the agency from acquiring or using the software development services until the protest is resolved. Agencies can avoid many protests by anticipating the potential impact of decisions made while developing and executing the acquisition. To conduct successful acquisitions, the agency should:

o Eliminate unnecessarily restrictive specifications
o State the agency's requirements clearly and exactly
o Develop fair and meaningful evaluation factors and apply them during source selection
o Provide all potential offerors with the same information
o Develop and follow an explicit source selection plan
o Document decisions made throughout the acquisition life-cycle, particularly those restricting competition
o Follow all pertinent Government and agency laws, regulations, and procedures
o Review past GSBCA and GAO protest rulings before completing the FIRMR studies and source selection plan

CHAPTER 3: SPECIAL CHARACTERISTICS OF SOFTWARE DEVELOPMENT SERVICES

Acquisitions of software development services, commercial software, and Federal information processing (FIP) support services are similar in a number of ways. Two other guides in this series, A Guide for Acquiring Commercial Software and A Guide for Acquiring FIP Support Services, discuss the latter commodities. However, each commodity also has unique characteristics that must be identified and addressed to conduct a successful acquisition. This chapter discusses some unique characteristics of software development services acquisitions that program, Information Resources Management (IRM)/technical, and contracting personnel should know about. However, these discussions are overviews, not comprehensive treatises. A full, detailed discussion of all aspects of software development is beyond the scope of this Guide. For this reason, IRM/technical and/or program staff experts in software development should always be involved in an acquisition of software development services. This chapter assumes that the Government will acquire the needed software development services through contracting. If an agency uses another acquisition method (e.g., a GSA assistance program), the Government developer would typically assume contractor responsibilities.

3.1 HOW SOFTWARE IS DEVELOPED

Software development is still largely a labor-intensive process. Although automated tools and thousands of books and training courses are available to assist design, programming, and testing, the process requires craftsmanship by individuals or small teams of developers. Standardization in software design has not evolved to the degree that it has in hardware design, and software cannot be manufactured on an assembly line. This section provides an overview of the typical steps, techniques, and tools used to develop software. The combination will vary among development projects, depending on an agency's internal development standards and the size and complexity of the software system.

a. Typical Life-Cycle Stages

Most software goes through several identifiable stages from initial requirements to operational system. Sometimes these stages are explicitly defined and followed rigorously, especially for large systems that need the efforts of many people. For smaller systems, some of the stages may be extremely short, very informal, or not conducted at all. Some or all of the stages may be conducted by agency personnel, some or all by a contractor. The names of the stages, their duration, the allocation of development activities among them, and the organization responsible for carrying them out vary for different life-cycle management methodologies and for different projects. The explanation presented in this Guide describes only one of many possibilities, but it represents a common division of stages. Chapters 13-16 provide details on each stage.

(1) Functional and Data Requirements

In the functional and data requirements stage, the agency and/or contractor defines in detail the system requirements, including both the automated software and related manual activities/procedures. However, this is not the first time the agency has defined requirements. It is an iterative process. Before this stage, during overall FIP resources planning, the agency developed a high-level view of its information needs. After the agency identified specific software initiatives, it developed more detailed requirements for the capabilities the software must have, the workload it must support, etc. However, during these initial requirements analysis activities, described in Chapter 5, the agency knows only that it needs software. It has not yet analyzed whether commercial software can meet its needs or whether it requires custom-developed software. At the beginning of the development life-cycle, the agency or the contractor defines the requirements for the software's capabilities and features at much greater levels of detail. Agency program and IRM/technical staff must be deeply involved in identifying detailed requirements to ensure the software meets the agency's and users' needs. The two primary classes of requirements are:

o Functional -- These describe aspects such as:
oo The flows of data to, from, and within the software
oo The sources or recipients of the data
oo How the software manipulates or transforms the data
oo The workload volumes for the data flows
oo Performance requirements (e.g., response times) for specific functional processes
oo The data contained in user interface/data entry screens and output reports, potentially including their physical layout

The functional requirements can be described by narrative, graphics (e.g., data flow or process flow diagrams, structured flowcharts or matrices), or a combination.

o Data -- These requirements describe the data the system will maintain. Data models may be developed and the data normalized to eliminate unnecessary elements. Often, the developer creates a data dictionary, defining each element in terms of name, attributes (e.g., alphanumeric, logical), length, format, permissible values, ownership (i.e., who can create the data element, who can modify it), and so forth.

The functional and data requirements may be called by other names, especially if graphical representations (e.g., data flow diagrams) divide the software into logical modules. Different methodologies call this step the system concept, conceptual design, or logical design.
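As a minimal sketch of the data dictionary idea described above, an agency or contractor might record each data element in machine-readable form. The element name, format, and ownership rules shown here are hypothetical illustrations, not requirements drawn from this Guide; Python is used only to keep the sketch short.

    # Minimal, hypothetical data dictionary entry for one data element.
    # The field names and permissible values are illustrative assumptions only.
    claim_number = {
        "name": "CLAIM-NUMBER",
        "attributes": "alphanumeric",          # e.g., alphanumeric, numeric, logical
        "length": 10,
        "format": "AA99999999",                # illustrative picture/format string
        "permissible_values": "first two characters must be a valid region code",
        "created_by": "Claims Intake Unit",    # who may create the element
        "modified_by": "Claims Examiner",      # who may modify it
    }

    def summarize(entry):
        """Print a one-line summary of a data dictionary entry."""
        print(f"{entry['name']}: {entry['attributes']}, length {entry['length']}, "
              f"format {entry['format']}")

    summarize(claim_number)   # CLAIM-NUMBER: alphanumeric, length 10, format AA99999999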
Before moving to the next stage, the agency ensures that the technical environment (i.e., hardware, systems software, telecommunications) has been specified for both the development and the operation of the software. This should occur during the requirements analysis described in Chapters 5 and 9, well before the development contract is awarded and the development process begins.

(2) Design

After the agency or the contractor establishes the functional and data requirements, the contractor knows "what" the software must do. In the design stage, the contractor defines "how" the software will do it. Frequently, more than one solution is available, so the contractor must consider several alternative physical methods for implementing the functions and managing the data, then choose the one that will most effectively meet the contract requirements. At this point in the development life-cycle, the software moves from logical to physical structures (programs and data files). The contractor defines programs to carry out specific system functions and divides the data into data stores, files, or data bases. In presenting the design graphically, the contractor can use different representations (e.g., data or process flow diagrams, flowcharts). Several methodologies refer to this first portion of the design stage as the general design. During the next portion, detailed design, the contractor writes specifications for the programs and data files, providing the level of detail programmers need for coding. The specifications can be represented in several different ways, depending on the life-cycle methodology used [e.g., hierarchical input-process-output (HIPO) charts, process action diagrams, Structured English, pseudo code, Warnier-Orr diagrams]. Computer-Aided Software Engineering (CASE) tools can automate much of both the general and detailed design activities, including graphic representations, narrative, and pseudo coding. The agency's primary responsibilities typically lie in the review of written design documents, which may also include graphical representations, and plans (e.g., Test Plan). The design methodology should include periodic review by users to ensure that the emerging design continues to meet agency needs. To assist this review, the agency may want to construct a manual or automated Requirements Traceability Matrix (RTM) that maps functional requirements to physical "configuration items," such as programs or databases. Then, whenever a physical software component is reviewed, audited, or tested during the development life-cycle, the agency can use the RTM to identify the related functional requirements.

(3) Build

During the build stage, programmers write or generate the software code (i.e., instructions the computer can understand). These instructions control the various hardware, systems software, and telecommunications components. The programmers usually develop the software components in sections from smallest to largest: first program units, then modules (groups of units) supporting major processes, then interfaces among modules and with external systems. The programs are debugged (i.e., errors are detected and removed) and documented. Finally, the entire software system is made ready for testing. The agency's primary activities during this stage include conducting regular audits, walkthroughs, and reviews of build stage products (e.g., code walkthroughs), as well as monitoring status and test reports submitted by the contractor. Different methodologies may refer to this stage as construction, coding, or programming.

(4) Test

Testing takes place in two major stages. First, the development contractor conducts its own tests of the software. Some life-cycle methodologies consider this testing part of the build stage. The contractor conducts unit tests, integration tests, and system tests, correcting errors as they surface. The contractor may also conduct regression testing, ensuring that correcting a particular part of the software does not cause problems in other parts. The contractor should conduct testing in accordance with a Test Plan that the agency reviewed and approved. The agency monitors the contractor's testing process and results, ensuring that the contractor conducts thorough testing consistent with the approved Test Plan and corrects any errors. In the second stage of testing, the agency takes a much more active role, conducting its own system tests to determine whether the software meets the contract requirements and can be accepted for production use. The agency may duplicate some of the contractor's tests using its own test data. The agency may also validate performance with benchmarking tests.
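The Requirements Traceability Matrix mentioned above can be kept in a very simple automated form. The sketch below is a hypothetical illustration (the requirement identifiers, program names, and test statuses are invented for the example); it shows how an agency might look up the configuration items that satisfy a requirement during a design review, or identify requirements that have not yet passed testing.

    # Minimal, hypothetical Requirements Traceability Matrix (RTM).
    rtm = [
        {"req_id": "FR-001", "requirement": "Accept claim entry on-line",
         "config_items": ["CLM010", "CLM015"], "test_status": "passed"},
        {"req_id": "FR-002", "requirement": "Produce monthly summary report",
         "config_items": ["RPT200"], "test_status": "not yet tested"},
    ]

    def items_for(matrix, req_id):
        """Return the configuration items (programs, databases) mapped to a requirement."""
        return [row["config_items"] for row in matrix if row["req_id"] == req_id]

    def untested(matrix):
        """List requirements whose configuration items have not yet passed testing."""
        return [row["req_id"] for row in matrix if row["test_status"] != "passed"]

    print(items_for(rtm, "FR-001"))   # [['CLM010', 'CLM015']]
    print(untested(rtm))              # ['FR-002']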
(5) Implement

The implementation stage includes pre-installation activities (e.g., site preparation, user training), installation of the software in the agency's hardware/systems software environment, and post-installation activities such as converting data from the old software systems to the new. If the software is installed at multiple sites, the implementation activities may take place in repeated waves. Depending on the tasks defined in the contract, the primary responsibility for software implementation may lie with the contractor, the agency, or a combination of the two.

(6) Operate and Maintain

When the software has been successfully implemented at all sites, and users and operators trained to use it, the system moves into the operation and maintenance stage. The software is now stable, with few changes except to correct lingering errors or to improve performance (i.e., tuning). The software may also require modification if the underlying hardware, systems software, or telecommunications environment changes. As new requirements develop, the software may be enhanced to add new functions, features, or capabilities. Software operation typically falls under the agency's responsibility. Maintenance may become the responsibility of the development contractor, if required by the contract. Alternatively, agency personnel may assume maintenance responsibility. The decision will be based on many factors, such as the availability of trained personnel within the agency and other sources of maintenance services (e.g., established sources of supply or a separate contract) for this system or a group of systems.


b. Tools Used in the Software Development Process

(1) Programming Languages

Programming languages are the primary tools of software development. Using these languages, a programmer writes code telling the hardware and systems software what to do. Dozens of programming languages exist, some proprietary for a specific product or platform, others in wider use.

A. Terminology

Agency personnel involved in acquisitions of software development services need to understand some of the terms used to describe programming languages:

o Source code -- Instructions that the programmer writes. They must be translated into a form understandable by the machine (i.e., the hardware and operating system) before they can be executed.
o Object code -- Instructions translated into machine language. For the hardware to execute the instructions, a further translation into binary code is required.
o Compilers, assemblers, and interpreters -- Programs that translate the source code into object code. Compilers typically convert large sections of code at a time, saving it for later execution. Interpreters translate program commands one at a time during execution of the program. Compilers may generate assembly language, an intermediate level, first, then translate it into machine language.

Figure 3-1 illustrates the difference between source code, assembly language, machine language, and machine binary code. For some particularly technical software functions, a few programmers may need to write assembly language directly. However, in most cases, programmers can rely on the "higher level" source code languages.

[Figure 3-1. Differences Between Source and Object Code]

B. Third and Fourth Generation Languages

Since the invention of the computer a half-century ago, programming languages have gone through several generations. Currently, programmers use third and fourth generation languages. The most widely used third generation languages (also known as procedural languages) include BASIC, COBOL, C, FORTRAN, PL/1, and PASCAL, which have fairly rigid structure and syntax. They are most useful for large, batch-oriented systems or systems with many users [where efficiency and central processing unit (CPU) utilization are important], or where many files are interacting or complex algorithms are used. Fourth generation languages (4GLs) provide much more flexibility, potentially reducing dramatically the time needed for coding. However, no 4GL has yet achieved the wide use of COBOL or other popular third generation languages. Many are proprietary, linked to a particular database management system (DBMS) environment. In fact, one of the primary advantages they offer over third generation languages is their powerful data query capabilities. Some 4GLs also include prototyping capabilities, permitting faster design and development of the user interface portions of the software.

(2) Automated Software Development

Until the last decade, software development was a largely manual process, with only a few automated tools (e.g., text editors, debuggers) to support system designers and programmers. However, recently the software industry has produced a large number of automated tools supporting every stage of the development process. While these tools offer the opportunity for greater productivity and faster development and implementation of systems, many organizations find that their greatest value is in improving the quality of the software developed and, through more rigorous documentation of the software, permitting easier maintenance and reuse.
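Before turning to the specific categories of automated tools, the source code and object code terminology introduced above can be made concrete with a small sketch. Python is used here only as a stand-in: its compile step produces bytecode for an interpreter rather than true machine language, so the sketch is an analogy, not an example of the compilers described in this Guide.

    import dis

    # Human-readable source code -- what the programmer writes.
    source = "total = price * quantity"

    # The translator (here, Python's bytecode compiler) converts the source into
    # lower-level instructions; a true compiler would produce object code in
    # machine language instead.
    code_object = compile(source, "<example>", "exec")

    # Display the translated instructions -- the analogue of examining object code.
    dis.dis(code_object)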
A. Computer-Aided Software Engineering

CASE tools cover a wide range of software designer and programmer productivity aids. In its broadest sense, CASE is commercial software that aids in the analysis, design, development, testing, and maintenance of computer programs and their documentation. The CASE tool captures plans, models, and designs and stores the information in a central database, repository, or encyclopedia. CASE tools are divided into four major categories.

o Upper CASE -- These tools support the front end development processes, such as planning, analysis, requirements, and design.
o Lower CASE -- These tools support the back end development processes, such as building the physical databases and coding the application programs that access them.
o Integrated CASE (I-CASE) -- These tools, typically provided by a single vendor, include both the Upper and Lower CASE components, providing a complete development environment.
o Component CASE (C-CASE) -- These tools consist of Upper and Lower CASE components from multiple vendors integrated to provide a complete development environment. Unlike I-CASE tools, which are often proprietary, C-CASE offers an open system development capability and the ability to match the capabilities of a particular CASE component to the agency's requirements.

Upper CASE tools are currently the most mature and provide the greatest functionality. Many products support specific Lower CASE functions, but fewer tools support all back end development processes in a consistent, well-integrated manner. I-CASE and C-CASE tools offer promise and are evolving quickly, but no product yet offers complete automation of the development process. CASE tools work best teamed with a software life-cycle management methodology. Some CASE tools are designed to work best with specific methodologies. CASE tools can include the following components:

o Project management support
o Table and matrix manipulation tools to support strategic and tactical planning
o Graphics tools for design elements such as data flow diagrams, models, and process action diagrams
o Prototyping tools for creating user interface screens and reports and for modeling or simulating system processes
o Requirements traceability tools to track agency and user requirements to particular software modules or programs
o A data dictionary
o A central repository for all design information
o Design analyzers (e.g., for finding outputs without related inputs, for normalizing data)
o Automatic code generators
o Testing tools

Although CASE tools offer productivity savings and the opportunity to implement systems more quickly, many organizations derive their biggest benefits from improving the quality of the developed systems and ensuring the completeness of their documentation. The growth of I-CASE and C-CASE tools also offers the promise of easy portability of software among platforms. Upper CASE components need to capture the requirements and design information only once. Then, to move from one platform to another, the agency would regenerate the application using the Lower CASE components for the new target platform.

B. Code Generation

Code generators automatically create program code, data definition language, and utility control language directly from detailed design information or program specifications. While usually part of an I-CASE or C-CASE system, independent code generators also exist. The code generator toolkit can produce both source code and, using compilers and assemblers, object code. The toolkit usually also includes editing, debugging, and unit testing tools.

C. Code Analysis

Another class of tools analyzes the code after its development. These tools can identify both over-used portions of code that cause bottlenecks and portions not used at all. Code profilers monitor the code while it is running, determining which portions execute most often, take the longest to run, or both. Programmers can use the results to restructure the code so that it runs more efficiently. Profilers are most useful for analyzing code developed by large numbers of people working on large software systems whose efforts may not have been tightly coordinated or controlled. Profilers are also useful for advanced software using object-oriented techniques or graphical user interfaces (GUIs). The code for such systems may become too complex for a single person's full understanding. Another class of code analyzers reviews the code in static mode (i.e., while not running), looking for "go to" statements, use of subroutines, etc., then represents the results graphically. In general, well organized, well-written code executes faster and is easier to maintain than "spaghetti" code (i.e., poorly organized and inefficient code). Static code analyzers can also review code for adherence to development standards, such as naming conventions and formats. Finally, code analyzers can produce "metrics" that monitor and manage the development process. Chapter 9 describes metrics in greater detail.
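As a minimal sketch of static code analysis (the naming standard and the sample source lines are hypothetical, and real analyzers are far more sophisticated), a simple tool might scan source code without executing it, count "go to" statements, and flag paragraph names that violate an upper-case naming convention:

    import re

    # Hypothetical source listing; in practice the analyzer would read program files.
    source_lines = [
        "PROCEDURE-A.",
        "    GO TO PROCEDURE-C.",
        "    MOVE X TO Y.",
        "badname.",
        "    PERFORM PROCEDURE-B.",
    ]

    def static_analysis(lines):
        """Scan code in static mode: count GO TO statements and flag paragraph
        names that break a (hypothetical) upper-case naming standard."""
        go_to_count = sum(1 for line in lines if "GO TO" in line)
        violations = [line.rstrip(".") for line in lines
                      if line.endswith(".") and not line.startswith(" ")
                      and not re.fullmatch(r"[A-Z0-9-]+\.", line)]
        return {"go_to_statements": go_to_count, "naming_violations": violations}

    print(static_analysis(source_lines))
    # {'go_to_statements': 1, 'naming_violations': ['badname']}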
D. Configuration Management

Configuration management is the process of controlling changes to parts of the software and related items (e.g., databases, documentation) so that they stay consistent and do not get placed in production before they are ready. It is a complex undertaking, especially for large and complex software systems. A number of automated tools are available for keeping an inventory of the various portions of the software, allowing change control managers to define libraries, conditions for check-in and check-out, and so forth. The tools can also help assess the magnitude and impacts of a proposed change.

E. Testing

When software systems were simpler and smaller, programmers could hand test most or all of their code. However, today's complex systems, which incorporate elements needing more lines of code (e.g., GUIs), require the type of complex testing that only automated tools can reasonably accomplish. A number of these tools are available to support the testing process:

o Project management -- This software can schedule tests, manage testing resources, and monitor progress against the test plan.
o Data capture and replay -- Useful for "black box" testing to focus on inputs and outputs rather than the internal algorithms of a program. These tools capture keystrokes for test scripts and the programs' outputs to establish baselines. The tools can then replay the data to run the tests, comparing actual versus expected results and reporting on data mismatches, error counts, and response time.
o Profilers -- Using code profiling tools, testers can determine which portions of the software code tests have exercised.
o Interactive testing/debugging -- These tools are specifically designed for the interim tests used as a part of good programming practice. Programmers can insert breakpoints, display data names as they are accessed, and record execution paths of programs and subroutines.
o Test data generators -- Particularly useful for large software systems and for performance testing, these tools automatically generate large numbers of "dummy" records with the required layouts and characteristics.
o Test case generators -- These tools, which are relatively new, identify and design test cases based on design specifications.
o Traceability tools -- These tools map the functional and data requirements to specific software components (e.g., programs) for tracking compliance with requirements during testing. An RTM is an example of such a tool.

F. Documentation

A key factor in the successful support and maintenance of software is thorough documentation. Not only must manuals be developed for users and operators, but the software itself must be documented in terms of purpose, design, source code, logic, etc. Good programming practices dictate that some of the code be self-documenting through insertion of comment statements and explanations. A number of automated tools may provide support for system documentation. Improvements in word processors and text editors make it easier for programmers to write, edit, debug, modify, and add documentation statements to their code. Also, CASE tools can capture much of the requirements, general, and detailed design information in electronic form as it is created.

(3) Prototyping

A key factor in the successful development and implementation of software is that it meets the needs of its users. However, program personnel, managers, and sometimes even IRM/technical personnel have a difficult time envisioning how software will look and operate by reviewing design documentation. Thus, prototyping (i.e., creating a skeletal, but operational, model of the software or one or more of its elements) is gaining a wide following in systems development. By showing the prototype to users and managers and then varying the elements based on their comments, developers can accelerate the requirements and general design tasks. The growth of prototyping has been facilitated by the transition from traditional batch systems to more interactive ones and by the wide availability of CASE tools, many of which have prototyping tools built in. The prototype can include one or many software components. The developer could prototype just data entry screens, screen dialogues, GUI elements, internal processes, or data content. Alternatively, these and other elements could be integrated into a model of the overall software system. Finally, the developers could build a "proof of concept" prototype showing that a particular combination of custom-developed and commercial off-the-shelf (COTS) tools can work together. Two different approaches apply to prototyping: "throw-away" and "evolve." In both approaches, developers mock up the software. Users and others then exercise the mock-up as though it were the operational system. They feed back to the developers any needed changes. The developers change and users re-exercise the mock-up. Through an iterative process, the developer modifies the prototype until it meets the users' requirements. In the throw-away approach, the prototype software is not implemented. Instead, it serves merely as the specification to build software with more traditional methods and tools. The prototype software is thrown away after the system becomes operational. One advantage of throw-away prototyping is that the prototype and the production version need not be developed on the same hardware and systems software platform.
The developers can use special prototyping software (sometimes included as part of a CASE tool) to develop the prototype quickly and easily. In the evolutionary approach, the prototype is gradually enhanced and modified (e.g., adding internal and batch processing) to become the production system. One concept currently emerging, Rapid Application Development (RAD), uses the evolutionary approach. RAD combines prototyping techniques and extensive user interaction to shorten the development period dramatically. The advantage of the evolutionary approach is that the system usually gets into users' hands more quickly than by following a standard development methodology. For that reason, it is an increasingly popular part of software development projects. However, prototyping also has the following potential disadvantages:

o Because the prototype design focuses on user interface elements and is thus incomplete, it may result in a highly inefficient system.
o Design documentation may not be as thorough, making later maintenance more difficult.
o Program staff could originally advocate a throw-away prototype, then want to keep it as an evolutionary system. This not only raises the potential issues listed above, it may also subvert the contract scope (e.g., a requirements analysis contractor might actually perform software development by calling it prototyping).

While none of these is a "fatal flaw," an agency should focus on mitigating such problems if it plans to use prototyping. The agency should ensure that the selected prototyping method results in comprehensive and detailed documentation. Proper use of CASE, C-CASE, and I-CASE tools can assist in producing efficient, well-documented systems that are easy to maintain.
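A throw-away prototype can be as simple as a mock-up of a single data entry screen. The sketch below is a hypothetical illustration of that idea (the screen, fields, and behavior are invented): users exercise the mock-up and react to it, and the code itself is discarded once the real software has been specified.

    # Hypothetical throw-away prototype of a proposed claim entry screen.
    # The mock-up exists only to draw out user comments; it is never put into production.
    def prototype_claim_entry():
        """Collect a sample record and echo it back the way the proposed screen would."""
        record = {
            "Claim number": input("Claim number: "),
            "Claimant name": input("Claimant name: "),
            "Amount claimed": input("Amount claimed: "),
        }
        print("\nThe proposed screen would redisplay the record as follows:")
        for field, value in record.items():
            print(f"  {field:15} {value}")
        return record

    if __name__ == "__main__":
        prototype_claim_entry()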


3.2 SOFTWARE LIFE-CYCLE MANAGEMENT METHODOLOGIES

a. Methodology Defined

Methodology, as used in this Guide, refers to the standards and procedures that affect the planning, analysis, design, development, implementation, operation, maintenance, and disposal of software. Thus, the Guide uses the term life-cycle management methodology, rather than development methodology, to avoid a perception that the methodology should only focus on the design and build stages.

b. What a Methodology Should Include

Many different life-cycle management methodologies exist. An organization can buy or use an existing methodology or write its own. The focus, names of components, and division of activities vary among methodologies. Nevertheless, most of them include at least the following components:

o Stages -- The methodology should divide the life-cycle into stages, noting which activities fall in each, and include a process for determining when the software can move to the next stage.
o Milestones -- The methodology should define major milestones (e.g., Preliminary Design Review, Test Readiness Review) in each stage. Each milestone should specify a deliverable (e.g., a written report, briefing, test result, portion of code). The methodology should also include milestones for when the agency approves completion of one stage and movement into the next.
o Content of Deliverables -- The methodology should define, either by topic or outline, what each milestone deliverable should include.
o Evaluation Criteria for Deliverables -- The methodology should describe the mandatory and desirable attributes of deliverables, in terms of both technical and documentation quality.
o Ongoing Monitoring -- The methodology should provide for an ongoing monitoring process of walkthroughs and audits in addition to reviewing milestone deliverables.
o Responsibilities -- The methodology should identify, by function [e.g., programming, quality assurance (QA), management], the organizational components responsible for each activity.
o Reporting and Tracking -- The methodology should provide ongoing tracking of progress against milestones and regular reporting of progress, issues, and problems.
o Tools -- If the methodology includes the use of automated tools (e.g., CASE), it should specify them by name or describe in detail the capabilities and features they should provide.
o Standards -- The methodology should identify practices and procedures to follow (e.g., naming conventions for programs). These could include both internal and external standards.
o Tailoring -- Not all software development projects are of equal size or complexity. A good methodology should include provisions to tailor the number and scope of reviews, audits, and testing to the size and scope of the system, permitting a "fast track" for small, straightforward systems. Similarly, the methodology should permit establishing many interim checkpoints (e.g., draft deliverables, preliminary reviews) for a system development effort that is large and complex.
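To make the tailoring component concrete, the sketch below encodes a hypothetical methodology as data so that the list of required reviews can be scaled back for a small, straightforward project. The stage names, milestone names, and the fast-track rule are illustrative assumptions, not part of any mandated methodology.

    # Hypothetical encoding of a life-cycle methodology's stages and milestones.
    methodology = {
        "Design": ["Preliminary Design Review", "Critical Design Review"],
        "Build": ["Code Walkthrough"],
        "Test": ["Test Readiness Review", "Acceptance Test Report"],
    }

    def tailor(stages, small_project):
        """Return the milestones required for a project; a small project keeps only
        the first milestone in each stage (a simple 'fast track' rule)."""
        if not small_project:
            return stages
        return {stage: milestones[:1] for stage, milestones in stages.items()}

    print(tailor(methodology, small_project=True))
    # {'Design': ['Preliminary Design Review'], 'Build': ['Code Walkthrough'],
    #  'Test': ['Test Readiness Review']}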

c. Benefits and Issues of Following a Life-Cycle Methodology

The primary benefits of following a methodology lie in the discipline it imposes on the management of software development services throughout the life-cycle. By requiring compliance with a methodology, the agency reduces the likelihood that important development steps will be overlooked or conducted ineffectively. In addition, a common set of instructions makes it easier to manage and coordinate the efforts of a large development team. However, a highly structured methodology requiring a lot of review and sign-off by various organizations can slow the development process. Delays occur frequently while the organization moves to the next step and while it resolves disagreements about vocabulary, priorities, direction, etc. The bureaucratic slowness of some methodologies has led to the use of Rapid Application Development that, while increasing the speed of the development process, removes some of the controls traditionally part of life-cycle management methodologies.


d. Should the Agency Specify a Methodology?

When acquiring software development services, the agency should decide whether to require the development contractor to follow the agency's internal life-cycle management methodology or permit the contractor to choose its own methodology. The advantage of specifying compliance with the agency's methodology is that Government personnel are familiar with it and more likely to become aware early of any contractor problems. In addition, if the software development is a joint effort, with agency staff focusing on one part of the system and the contractor on another, or part of a larger initiative involving multiple contractors, a single methodology makes sense. On the other hand, allowing a contractor to specify its own methodology may permit faster delivery of the software. The contractor's personnel are more familiar with its methodology than the agency's, and the development tools and techniques the contractor plans to use may integrate more effectively with its methodology. The agency may want to consider a middle ground. It might specify in the solicitation the mandatory requirements (i.e., contents) of a life-cycle methodology, but let the contractor specify its own methodology as long as it meets these requirements.

3.3 QUALITY

The Government should always maximize the value it receives when acquiring FIP resources by contractual or other means. Value is a combination of quality and cost. Cost is relatively easy to define, and the standard measure (dollars) does not vary among the various types of FIP resources (e.g., software development services, hardware). Quality, on the other hand, is more difficult to define, and the measures of quality vary among the types of FIP resources. Some resources (e.g., hardware) have straightforward, easily quantifiable measures of quality (e.g., downtime per month). The measures defined for others, including software development services, are more subjective.

a. Two Measures of Quality

An agency can measure quality in two ways. Using the terminology for system testing, these are static analysis and dynamic analysis. Static analysis compares the output of the contractor's services to predetermined, concrete measures. For example, an agency can conduct static analysis of software documentation by answering questions such as:

o Does the documentation comply with standards such as Federal Information Processing Standards Publication (FIPS PUB) 38, Guidelines for Documentation of Computer Programs and Automated Data Systems?
o Are the documents consistent with the outlines provided in the contractor's proposal?

Dynamic analysis, on the other hand, determines the value of the contractor's services in use. Dynamic analysis of the software documentation answers questions such as:

o Are the documents organized and written in such a way as to facilitate access by both casual and detailed readers?
o How much are the documents likely to reduce potential operator and user error in working with the software?

Static analysis provides objective answers to questions of quality. Different reviewers should reach the same conclusions because they use the same standard of comparison. Dynamic analysis is more subjective, and opinions may differ among reviewers.
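One hedged way to picture static analysis is as a checklist of predetermined, concrete criteria that any reviewer would score the same way. The checklist items and the scoring rule below are hypothetical illustrations only.

    # Hypothetical static-analysis checklist for one documentation deliverable.
    checklist = {
        "Complies with the FIPS PUB 38 documentation outline": True,
        "Consistent with the outline in the contractor's proposal": True,
        "All sections required by the contract are present": False,
    }

    def static_score(results):
        """Return the fraction of predetermined criteria the document meets."""
        return sum(1 for passed in results.values() if passed) / len(results)

    print(f"Static quality score: {static_score(checklist):.0%}")
    # Static quality score: 67%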

b. The Importance of Quality in Software Development Services Acquisitions

The Office of Federal Procurement Policy (OFPP) discusses quality in service contracting (including software development services contracting) in its Policy Letter 91-2, which is reproduced in Appendix F. The letter emphasizes the use of performance requirements and quality standards in defining requirements, in selecting a contractor, and in assuring quality after award. This approach helps the agency achieve the established performance quality level and ensure that the Government pays only for services which meet the standards. The letter grew from OFPP's concern about service acquisitions that contain unnecessarily vague statements of work, do not use fixed pricing arrangements often enough, lack quantifiable performance standards, and do not have adequate quality assurance surveillance. OFPP also expressed concern that the Government underemphasized quality vs. price in acquiring services. Accordingly, OFPP has established the policy that: (1) agencies use performance-based contracting methods to the maximum extent practicable when acquiring services, and (2) agencies carefully select acquisition and contract administration strategies, methods, and techniques that best accommodate the requirements. Agencies must justify using anything other than performance-based contracting methods. The requirements for software development contractors described in Chapter 9 include several performance-based measures. OFPP defines performance-based contracting methods as the following:

o Statement of work -- Agencies should describe "what" the required output is rather than "how" to accomplish the work. The agencies should also consider issuing draft solicitations for comment so the vendor community can assist in identifying specifications that do not meet this criterion.
o Quality assurance -- While the software is being developed, agencies should assign contractors primary responsibility for quality performance rather than relying on cumbersome and intrusive process-oriented inspection and oversight programs. At the same time, agencies should develop formal, measurable performance standards and surveillance plans for quality, timeliness, quantity, etc.
o Selection procedures -- Agencies must use competitive, negotiated contracts when quality of performance above the minimum acceptable level will enhance the agency's ability to accomplish its mission and support the corresponding increase in cost. This approach applies to most software development services contracts. When the minimum acceptable quality is enough, agencies should consider sealed bidding. In proposal evaluation and selection, agencies should use factors such as technical capability, personnel capability, management capability, and past performance. Finally, agencies must examine proposals for cost realism, ensuring that offerors understand the requirements and that the technical proposals are consistent with the cost proposals.
o Contract type -- Agencies should choose the contract type most likely to motivate contractors to perform at optimal levels. Fixed-price contracts are appropriate for software development services that--
oo Can be defined in objective, specific terms
oo Have limited, manageable performance risk
oo Are frequently acquired
oo Require no more than a minimally acceptable level of performance
While some software development efforts meet these criteria, others do not. For the contracts that do, the agencies should define measurable performance standards, monitor them with surveillance plans, and enforce them. Cost-reimbursement contracts are appropriate for software development services--
oo That can be defined in less specific terms
oo Which involve a high degree of performance risk
oo That are complex or unique
oo In which quality of performance is paramount
These contracts should provide incentives to motivate the contractor to perform quality work and to discourage inefficiency and waste.
o Repetitive acquisitions -- When recompeting an existing software development services contract, the agency should rely on the experience gained during the existing contract to define the performance measures. The new contract must follow the provisions of Policy Letter 91-2, not just repeat the terms of the existing contract.

As Policy Letter 91-2 points out, steps agencies can take to maximize quality fall into two categories:

o Steps taken to predict quality when soliciting and evaluating proposals and awarding a contract
o Steps taken to enforce quality standards after contract award

Each of these categories is discussed below.

c. Predicting Quality Before Award

At the end of a contract, the agency can look back and see whether it received the quality anticipated when it first decided to acquire software development services. Retrospective analysis is easy, but it is difficult to guarantee high quality before the work occurs. Including performance and quality standards in the contract makes quality more likely, but the best outcome will result when the contractor performs high quality work on its own. The question is: How can the agency find software development contractors most likely to provide the quality needed? Since the agency cannot take a retrospective look in advance, the most it can do is apply measures proven useful in predicting quality. The software development contractor attributes listed in Chapter 10 include predictors. The quality of proposed personnel has particular importance in a software development effort. Other important attributes include corporate qualifications, sample deliverables, and offerors' internal quality assurance processes. However, these measures are predictors, not guarantees. For example, although years of experience usually yield high quality, a recently graduated systems designer from a top university may provide higher quality than a more experienced but mediocre performer.


d. Ensuring Quality After Award

Although the agency may award the contract to an offeror with high quality predictors, it must still establish techniques and procedures to ensure that quality software is delivered. Two key methods that help assure quality after award are configuration management and testing. Configuration management is the process of controlling changes to parts of the software and related items (e.g., databases, documentation) so that they stay consistent and do not get placed in production before they are ready. Software configuration management is usually conducted by establishing separate libraries or environments for code under development, code under testing, code in operation, etc. Programmers check out the code from the library/environment to work on it, then return it when they are finished. Before the contractor submits the software to the Government for acceptance and the agency implements the software for production use, it must be thoroughly tested to ensure it is error free and meets the functional, performance, and other requirements specified in the contract. Testing does not occur only at the end of the development life-cycle. Rather, design testing should take place before coding begins and testing of software components should occur during development. Early detection of errors permits the contractor to correct them faster and less expensively than errors discovered when the developer tests the completed software. Chapters 9, 14, and 16 discuss these methods and additional quantitative and qualitative performance measures.
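A minimal sketch of the library-based change control described above might track which library each component resides in and who has it checked out. The library names, component names, and rules are hypothetical; real configuration management tools provide far more control than this illustration.

    # Hypothetical configuration management ledger with separate libraries for
    # code under development, code under test, and code in operation.
    libraries = {"development": set(), "test": set(), "production": {"CLM010", "RPT200"}}
    checked_out = {}   # component -> programmer who has it

    def check_out(component, library, programmer):
        """Record that a programmer has checked a component out of a library for changes."""
        if component in checked_out:
            raise ValueError(f"{component} is already checked out to {checked_out[component]}")
        libraries[library].discard(component)
        checked_out[component] = programmer

    def check_in(component, library):
        """Return a changed component to the named library (e.g., to 'test' after changes)."""
        programmer = checked_out.pop(component)
        libraries[library].add(component)
        print(f"{component} returned to {library} library by {programmer}")

    check_out("CLM010", "production", "J. Programmer")
    check_in("CLM010", "test")   # CLM010 returned to test library by J. Programmer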

3.4 MEASURING PROGRESS

Progress and delivery are typically more difficult to measure in contracts for software development services than in contracts for other types of FIP resources, such as hardware and commercial software. Measuring progress on a software development contract differs greatly from walking down to the loading dock and checking the number of completed units delivered. While the Contracting Officer's Technical Representative (COTR) could determine, through a review of contractor status reports or invoices, how many hours of programming the contractor delivered during a month, this does not directly relate to progress towards completion. In addition, for most software development services contracts, the delivery of operational software usually does not occur until near the end of the contract, though work leading to it may have been underway for months. If the agency does not carefully and consistently define its requirements, the contractor may spin its wheels for months or try to guess what the agency really wants and guess wrong. Without interim progress monitoring, the agency may not recognize that a problem exists. Progress measurement is particularly important in cost-reimbursement and time-and-materials contracts to determine whether the agency will receive the full value it anticipates for the estimated cost. For example, the agency must determine not only that 75% of the resources have been expended, but also that 75% of the work has been accomplished. The contract may include mechanisms to gauge the progress and quality of the work. These allow the agency to monitor the contractor's progress, provide direction and approval at regular intervals, and address any performance, schedule, or cost problems before they become major. These mechanisms include:
o Periodic status reports and/or progress briefings
o Interim deliverables of preliminary products and work-in-progress
o Design reviews, audits, and walkthroughs
o Reports on the results of the contractor's testing program
o Documents providing evidence of the contractor's analysis and advice
Chapters 9, 12, and 14 describe these mechanisms in more detail.
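The comparison of resources expended with work accomplished can be made concrete with simple arithmetic in the spirit of earned-value tracking. The sketch below (Python, not prescribed by the Guide) uses hypothetical numbers; the function name and the 75%/60% figures are illustrative only.

    # Illustrative sketch only -- compares the share of resources expended
    # with the share of work actually accomplished; all numbers hypothetical.
    def progress_check(cost_expended, estimated_cost, work_units_done, total_work_units):
        """Return expenditure and accomplishment percentages and a simple
        performance ratio (work accomplished per unit of resources spent)."""
        pct_expended = cost_expended / estimated_cost
        pct_accomplished = work_units_done / total_work_units
        return pct_expended, pct_accomplished, pct_accomplished / pct_expended

    # 75% of the estimated cost has been spent, but only 60% of the planned
    # work products have been accepted.
    spent, done, ratio = progress_check(750_000, 1_000_000, 60, 100)
    print(f"Resources expended: {spent:.0%}")
    print(f"Work accomplished:  {done:.0%}")
    print(f"Performance ratio:  {ratio:.2f}  (below 1.0 warrants attention)")

Whatever measure the contract uses, the point is the same: the agency needs a defined unit of work (accepted deliverables, completed modules, approved design documents) so that expenditure and accomplishment can be compared at regular intervals.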

3.5 INVOLVEMENT OF MULTIPLE CONTRACTORS DURING SOFTWARE DEVELOPMENT

One aspect of software development unique to the Federal Government is that a single contractor rarely conducts the entire development life-cycle. Instead, the development process is typically broken into the "front end" and the "back end." The front end includes the identification of the functional and data requirements and, perhaps, the high-level general design of the software. The back end includes the detailed design, coding, and internal testing. Sometimes agency personnel will conduct the front end, with a contract awarded for the back end. In other situations, the agency may award one contract for the front end and another for the back end. Neither the Federal Information Resources Management Regulation (FIRMR) nor the Federal Acquisition Regulation (FAR) requires that agencies split
responsibilities this way. However, the practice can help an agency address concerns about conflict of interest. The majority of the labor (i.e., programmers) and machine resources devoted to software development are employed in the back end. A smaller team of systems analysts and designers typically conducts the front end. These economics of software development may create a conflict of interest. A single contractor might specify overly complex requirements or designs, hoping to make more money during the back end. Splitting front and back end responsibilities helps ensure objectivity in the front end. When an agency decides to split a software development effort between two contractors, it must also consider the effect on competition for the back end contract. The front end contractor may have an unfair cost or technical advantage due to its familiarity with the system requirements and/or general design. The agency should consider the organizational conflict of interest guidelines in FAR Part 9 and determine if it is necessary to restrict the front end contractor's ability to compete for the back end contract. While the division of development duties between two contractors helps prevent a conflict of interest, it has certain potential negative side effects:
o When the back end contractor begins work, it must go through the same orientation and learning process the front end contractor went through. Going through the learning curve twice takes additional contract resources and slows the development schedule.
o The back end contractor may use different design techniques or tools than the front end contractor. Thus, some of the design information may require reformatting or conversion (if it is in automated form) before the back end contractor can proceed with the detailed design.
o Without careful coordination of the front end and back end acquisitions, a time lag can result between the end of one and the beginning of the other because of schedule uncertainties in the solicitation, proposal evaluation, and contract award processes. All this can delay the implementation of the software.
o Since the front end contractor does not have to implement the software, the quality of its activities may not be as high as would otherwise be the case. This means that agency quality activities become more important in guaranteeing the success of the software.
o The back end contractor may claim, after award, that the front end work was incomplete, infeasible, inefficient, or poor quality, and that it should duplicate much of the front end work (at additional cost) to raise it to the standard needed to conduct back end activities.
For large, complex software development projects, agencies sometimes use a separate contractor to supplement the agency's monitoring and review of a development contractor's work. This other contractor, referred to as a quality assurance (QA) or independent verification and validation (IV&V) contractor, can parallel the efforts of agency technical monitoring staff, have full responsibility for some monitoring activities, or play the primary hands-on monitoring role, reporting its results and concerns to the COTR.

3.6 SIGNIFICANT CONCERNS IN DEVELOPING SOFTWARE

a. Development of Open Systems

A major factor in today's software development environment, especially in the Government, is the move toward "open" systems. In the early days of computing, the dominance of proprietary hardware and software products provided little or no opportunity to interchange components or to move data among systems. This tended to lock an agency into using the products of a single vendor, even if competing products offered better price, performance, features, or capabilities. To alleviate this situation, the information technology industry has been moving toward compatible or open systems based on generally accepted standards. In December 1990, the Institute of Electrical and Electronics Engineers (IEEE), Committee on Open Systems, issued the following definition of open systems: A comprehensive set of international information technology standards and functional standards profiles that specify interfaces, services and support formats to accomplish interoperability and portability of applications, data, and people. The primary goals of open systems are:
o To permit organizations and users to select the hardware, systems software, and applications software components they want from a variety of vendors
o To permit easy sharing of data across different systems operating in different locations and on different platforms
o To allow applications to be "portable" (i.e., easily moved from one platform to another with little or no rewriting)
Open systems standards affect custom-developed software insofar as they define the hardware and systems software platform used for development.

Standards can be either de jure or de facto. De jure standards are those set by an acknowledged standards-setting body, such as the National Institute of Standards and Technology (NIST), the IEEE, or the American National Standards Institute (ANSI). NIST has set a number of open systems standards through its FIPS PUBS. Some of these are now mandatory for FIP acquisitions. FIPS PUBS moving the Government toward open systems include:
o FIPS PUB 127, Structured Query Language (SQL) -- Defining standards for data access
o FIPS PUB 146, Government Open Systems Interconnection Profile (GOSIP) -- Defining standards for telecommunications among systems
o FIPS PUB 151, Portable Operating System Interface for Computer Environments (POSIX) -- Defining standards for operating systems
o FIPS PUB 156, Information Resource Dictionary System (IRDS) -- Defining standards for data dictionaries
o FIPS PUB 158, X Window System -- Defining standards for GUIs
De facto standards are based on wide acceptance of a standard published by an individual vendor. For example, the MS-DOS operating system, although controlled by a single vendor, has been adopted by hundreds of microcomputer equipment manufacturers and thousands of commercial software vendors. Although the industry continues to move towards open systems, the evolution is not yet complete. For some standards (e.g., POSIX), NIST has certified relatively few products as fully compliant. In addition, standards frequently focus on a core set of specifications. Vendors may support these standards, but by adding their own extensions and improvements for items outside the core, they create potential incompatibilities. Agencies with requirements subject to Federal standards must ensure that contractors comply with these standards.

b. Designing-in Computer Security Throughout the Life-Cycle

Computer security has become a critical factor in the success of software. Unfortunately, agencies frequently address security issues relatively late in the software development life-cycle. Agencies should identify computer security requirements during the requirements analysis. The two major concerns for computer security are: (1) protecting data from unintentional or accidental alteration or destruction, and (2) preventing unauthorized access to and/or destruction of data. Computer security has greater significance for the success of certain types of software (e.g., agencywide payroll and accounting), but any software important to an agency's mission has potential computer security impacts and needs. Program and IRM/technical personnel should provide input into security needs and participate in developing security policies. Program personnel can identify the crucial data, as well as the events and people from which the data require protection. IRM/technical personnel can help define the mechanisms for protecting the data, such as access control, security hardware or systems software, monitoring and audit capabilities, or encryption. Addressing computer security issues throughout the life-cycle may result in the recognition of other issues. For example, the analysis of data access across agency divisions may result in questions of data ownership. Before software enters the operation and maintenance stage, agencies should establish a complete system security policy. It should include procedures for backup and recovery, user authorization and access, virus identification and prevention, and monitoring and reporting.

c. Software License Rights

Generally, the Government owns the software developed under a contract, not the contractor. However, situations may occur in which commercial software components are used in the development or included as part of a delivered software system. Usually the decision to incorporate commercial software is deliberate (e.g., a run-time version of a DBMS package is included with an application developed using the DBMS' programming language). However, a programmer might embed a proprietary component in the custom-developed application software without the knowledge of the contractor's management or the Government. Most commercial software is protected by copyright, and some packages may also be patented, which gives them added protection. When an organization purchases a commercial software package, it usually purchases only the right (i.e., the license) to use the package. The ownership of the intellectual property, the underlying source code, usually remains with the author or publisher. If a programmer includes proprietary components in custom-developed software without the knowledge of the contractor's management or the Government, the Government might inadvertently place the proprietary software in the public domain by accepting and using the new
system. The owner of the intellectual property would then have a valid claim against the Government. Because of the potential for such situations, Government personnel should understand software licensing rights. Program, IRM/technical, and contracting personnel should all become familiar with the terms and conditions typically covered by software licenses and carefully craft all contract specifications, terms, and conditions to avoid problems after contract award. A Guide for Acquiring Commercial Software, another guide in this series, discusses commercial software licensing in detail. To protect the Government from accepting proprietary software unknowingly, the Contracting Officer (CO) should require offerors to certify in their proposals that they can grant clear title to all software products delivered. In addition, the CO might consider including indemnification clauses in the contract to cover claims by intellectual property owners whose software was unknowingly placed in the public domain because of a software development contractor's actions. The CO should review provisions of FAR Part 27 on Government rights in data and copyrights and include the appropriate clauses from FAR Part 52 to guarantee the Government's rights.

d. Data Administration Considerations

Many agencies have established a data administration function to help manage data in manual and automated information systems. The data administration function is still evolving and does not have a consistent definition. It generally involves controlling the acquisition, analysis, storage, retrieval, and distribution of data. It also usually includes defining, organizing, supervising, and protecting the data within the agency. Common goals and objectives of data administration include:
o Promoting data consistency and standardization by developing standards for data element names, definitions, values, formats, and documentation in a central data repository and in databases
o Creating an architecture that relates the conceptual, logical, and physical models of the data to the information needs and business functions of the enterprise
o Minimizing duplication in collecting, processing, storing, and distributing data
o Improving data management and access through the use of appropriate existing and new methods, tools, and technologies
IRM/technical personnel can provide specific guidance to ensure that a software development services acquisition complies with applicable data administration standards and procedures.
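A standardized data element entry of the kind a central repository might hold can be pictured as in the sketch below (Python, not part of the Guide). The element name, its attributes, and the single conformance rule are hypothetical and serve only to illustrate data standardization.

    # Illustrative sketch only -- a minimal, hypothetical data dictionary entry.
    data_element = {
        "name": "CLAIM_RECEIPT_DATE",
        "definition": "Date the agency received the claim from the claimant.",
        "format": "YYYYMMDD",
        "values": "Valid calendar dates, not later than the processing date.",
        "documentation": "Source: claims intake form, block 7.",
    }

    def conforms(record_value: str) -> bool:
        """Check one standardization rule: the value must be 8 digits (YYYYMMDD)."""
        return len(record_value) == 8 and record_value.isdigit()

    print(conforms("19930415"))  # True
    print(conforms("4/15/93"))   # False -- nonstandard format

Recording definitions and formats once, in one place, is what allows different systems and different development contractors to exchange and validate the same data consistently.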

e. Emerging Trends in Software Development

As in all areas of FIP resources, software development is changing rapidly. Some of these changes have already been discussed, such as the automation of many development steps. This section describes other recent trends that may affect software development contracting.

(1) Changes in the Economics of Software Development
In the early days of computing, hardware was by far the most expensive component of a system. However, through the years, the cost of a given level of computing power has decreased dramatically. Today one can buy a $1,000 microcomputer that offers more processing power and storage than a multi-million dollar mainframe of 20 years ago. The cost of developing software, however, has not decreased very much, even with the availability of CASE and other automated development tools. What has changed is the wide availability of relatively low cost commercial software. Agencies can acquire a powerful word processing or spreadsheet program for $300-$500. Thus, when agencies contract for a custom software development project, especially in a microcomputer environment, personnel are often shocked at the cost ("How can it cost $100,000 for a system running on less than $5,000 worth of hardware and systems software?"). Many agency personnel fail to consider that it costs millions of dollars to develop the commercial software, but the cost was spread over hundreds of thousands or millions of copies. Since, in a custom development situation, only one organization will use the software, no potential for economies of scale exists.

(2) Rightsizing
With changing hardware economics, many organizations find it is no longer cost-effective to retain certain multi-user applications on mainframes in large data centers with all their associated overhead (e.g., special environments, systems operation staff). A trend has developed to move these applications into other environments, such as networked microcomputers. At one point, the trend to move to less costly platforms became so pervasive that industry coined the term "downsizing" to describe it. However, organizations soon discovered that placing applications on smaller, distributed platforms did not always provide the best solution, especially since networked microcomputers carry
their own overhead (e.g., network management, correcting errors caused by users not schooled in data management). Thus, the current term used in the industry is "rightsizing," placing an application on the most appropriate platform given the application's nature (i.e., batch or interactive), processing and data volumes, need for up-to-date data, sophistication, and profile of users.

(3) Incremental Software Development
Every software development project will include the life-cycle stages discussed earlier (i.e., functional and data requirements, design, build, test, implement, operate, and maintain). However, development models differ in the way they organize these stages (i.e., the order in which the stages are performed) and the criteria for progressing from one stage to the next. This Guide bases its description of software development primarily on the traditional waterfall model. That model relies on a thorough and accurate description of users' needs at the outset and complete documentation for each stage before proceeding to the next. While this model can efficiently support many software development efforts, it does not have sufficient flexibility for projects with a high degree of risk and uncertainty. Incremental software development models employ methods that build software in incremental steps. Most of these models repeat the life-cycle stages for increasingly complete versions of the developed software. Development occurs in a series of progressive steps that provide opportunities for the agency and development contractor to evaluate the project, incorporate change, and refine the product in process. This helps the agency to:
o Ensure that the effort will meet user requirements
o Recognize and minimize cost and schedule risks
o Accommodate technology advances
Agencies may find incremental methods particularly suitable when:
o User needs are complex and difficult to define at the outset of the project
o User needs may change as a result of automation (e.g., automation may lead to changes in an organization's business processes)
o The development effort will cover a long period during which technology may evolve in unexpected ways
Various incremental methods are available that an agency or development contractor can apply or tailor as needed for a specific project. The U.S. Patent and Trademark Office (PTO) has designed and implemented a managed evolutionary development method that is an extension to the waterfall method. It capitalizes on the rigor of the waterfall approach while increasing the ability to handle risk and uncertainty. The PTO Managed Evolutionary Development Guidebook describes this example of incremental systems development in the Government. Readers can obtain a copy of the Guidebook by writing or calling: U.S. Patent and Trademark Office, 2121 Crystal Drive, Suite 1004, Arlington, VA 22202, (703) 305-8782.

(4) Interactive Design Techniques
A basic tenet of defining requirements and designing systems is to involve the users. Traditionally, developers have done this by interviews, surveys, and other one-to-one forms of communication. Unfortunately, for large systems, the serial interview/survey process can be lengthy, sometimes lasting a year. In addition, the developers must interpret what they heard about users' activities and work products in software terms, consolidate diverse views in an overall framework, and then feed back written results for users' review and approval.
In their review, users may disagree with certain aspects of the requirements or design because they did not hear the rationale given by other users or, because of the time lag, they forget what they said during their interviews. This makes it difficult to achieve organizationwide consensus on the requirements or design. To mitigate this problem and accelerate the front end development processes significantly, agencies can use interactive design techniques. The agency or contractor conducts workshops led by a trained facilitator whose objective is to manage the group process and achieve consensus, not to arrive at a pre-determined design. Representatives of the different user organizations and the IRM/technical staff participate. During the workshops, which typically last one to three days each, the group systematically walks through the requirements or design at ever increasing levels of detail. A workshop could proceed as follows. If a group's goal requires identifying the software's information flows, it might choose to develop data flow diagrams. The facilitator draws a box on a chalkboard or flip chart representing the software as a unit and asks the group to identify all the systems and organizations that the software would impact. In the next steps, the group identifies all the data flowing to and coming from each external system or organization. The facilitator does not move to another step until the group agrees that all external relationships have been captured, thus achieving consensus on that level of detail. Depending on the complexity of the software, either the same or a different group divides it into major processes, showing data flows among the processes and the external flows. This becomes the "level zero" data flow diagram. Each process is then divided into more specific levels of detail. One or two recorders, working with the facilitator, take detailed notes during the workshop. They might also use
electronic aids, such as a word processor or Upper CASE tool. At the end of the workshop, the group identifies open issues, assigns each to a participant for resolution, and establishes a date, ideally before the next workshop, for resolution. A few days after each workshop, the recorders distribute written results to all participants and other interested people. This solidifies the consensus. Interactive design workshops can support many other development activities, such as designing data entry screens and reports, brainstorming, or developing plans (e.g., test plan, implementation plan). Using interactive design techniques, many organizations have found that they can reduce the overall length of the front end development process by 50% or more. A number of variants of interactive design techniques exist, many paired with a life-cycle management methodology. The oldest and most widely used technique is called Joint Application Design (JAD).

(5) Object-Oriented Programming
The objective of object-oriented programming (OOP) techniques, a recent development, is to ease maintenance and high-level programming by dividing complex software into simple, essential parts (i.e., "modularizing" the software). By encapsulating a set of processes and data into a standalone "object," the programmer can ignore what happens inside the object. The programmer needs to understand only the parameters for dealing with the object. All objects are defined at runtime, and their functions depend on the circumstances that exist then. Thus, the processing performed by an object and the information fed into and out of it may vary each time the system is used. OOP is both simple and complex. Understanding how to design and program individual objects is complex. However, linking a set of pre-defined objects to develop software to carry out a set of functions is simple since the programmer need not understand the internal workings of the objects. The growth of GUIs has accelerated the move to OOP. In a GUI, users select an object (e.g., a screen icon), then perform an action on it (e.g., open it) without needing to know much about the object's internal functioning. OOP's primary benefits are greater reuse of software, easier development of new applications after establishing the objects, and easier management and maintenance of software. A new object can easily be substituted for an older object, as long as it has the same external parameters. OOP tools available today consist both of traditional languages enhanced with object-oriented aspects (e.g., C++), as well as complete object-oriented environments. (A brief illustrative sketch of encapsulation appears at the end of this section.)

(6) Reengineering
Reengineering relates closely to the business functions that the software system supports. Instead of just revamping or streamlining existing business processes, an organization can take advantage of improvements in software technology to redesign business functions to improve speed, productivity, and quality. Related goals include simplifying and reducing the dependencies among organizations and business functions, identifying and eliminating unnecessary activities, and using information technology to best advantage. The agency must first define its business rules: the missions it accomplishes and the processes and activities it uses to accomplish them. Then, the agency can take advantage of software capabilities to change these processes and activities. Some agencies may also combine a number of other software improvement concepts under reengineering.
These can include performance tuning, changing the underlying hardware or systems software components, streamlining and removing redundancies, and moving functions among software systems.

(7) Reverse Engineering
Reverse engineering analyzes an older software system that needs replacement, but whose functions and capabilities the agency still needs. The agency or its contractor first tries to understand the software's underlying rules, processes, and data. It uses these as specifications for a complete redevelopment of the software to take advantage of the latest technological advances. Reverse engineering recognizes that, when writing new software to completely replace older software, the new system must accommodate countless rules, formats, and definitions captured in the old system. Extracting these electronically from the old software is faster and more thorough than trying to invent them again. Reverse engineering is a fairly rigid process. Many of the steps are manual (e.g., identifying which parts of the application will be reengineered, placing them in libraries). The two automated parts are: (1) reverse engineering the old source code and database modules with software that converts them into rules, definitions, and translation criteria stored in a repository that a CASE tool can access; and (2) generating the new code from the repository rules with the CASE tool.
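As referenced in item (5) above, the sketch below illustrates encapsulation, the central OOP idea. It is not part of the Guide and does not prescribe a language or design; Python is used for brevity (the text names C++ as one OOP option), and the class and its methods are hypothetical.

    # Illustrative sketch only -- a hypothetical example of encapsulation:
    # callers use only the object's external parameters and never touch its
    # internal data.
    class ClaimsQueue:
        """A standalone "object" that hides how pending claims are stored."""

        def __init__(self):
            self._pending = []          # internal detail; callers never see it

        def submit(self, claim_id, amount):
            """External parameters: a claim identifier and a dollar amount."""
            self._pending.append((claim_id, amount))

        def next_claim(self):
            """Return and remove the oldest pending claim, or None if empty."""
            return self._pending.pop(0) if self._pending else None

    # A new object with the same external parameters could replace ClaimsQueue
    # (for example, one backed by a database) without changing the calling code.
    queue = ClaimsQueue()
    queue.submit("CLM-0001", 1250.00)
    queue.submit("CLM-0002", 310.75)
    print(queue.next_claim())   # ('CLM-0001', 1250.0)

The substitution point in the closing comment is the practical benefit the text describes: as long as the external parameters stay the same, the internal implementation can change or be reused elsewhere without disturbing the software that uses it.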

CHAPTER 4: OVERVIEW OF THE ACQUISITION LIFE-CYCLE

Acquisition begins when an agency establishes a need for Federal information processing (FIP) resources and ends with implementation of the most advantageous alternative. Thus, it involves much more than issuing a solicitation and awarding a contract. The acquisition life-cycle, which is separate from the software development life-cycle, has four phases. Each phase is divided into several steps. This chapter introduces these phases and steps, which are then discussed in more detail in later chapters of this Guide. Figure 4-1 summarizes the acquisition life-cycle.

4.1 PLANNING PHASE

During the planning phase, the agency identifies the overall need for software. The agency usually completes this phase before initiating an acquisition. The planning phase consists of three steps.

a. Identifying Automation Needs

The agency first identifies its "business profile," including its mission, day-to-day functions, and existing automated and manual systems. After analyzing its profile, the agency identifies opportunities for using automation to solve problems or to improve the efficiency and/or effectiveness of its operations. These opportunities will typically include all types of FIP resources, not just software.

b. Planning

The agency then plans specific automation projects that include one or more of the identified opportunities. Some of them will be software projects. As part of this process, the agency submits a five-year automation plan to the Office of Management and Budget (OMB) and GSA. The agency should decide whether to proceed with each project by comparing its costs and benefits over the life-cycle. These should include not only the agency's own costs and benefits, but also those the agency believes would accrue to other agencies, industry, and the public.

c. Budgeting

The agency budgets funds for implementing the approved projects. Some projects can be accomplished in a single fiscal year; others stretch over several years.

4.2 ACQUISITION PHASE

The acquisition phase begins with a requirements analysis and ends with a contract award or utilization of a noncontractual option.

a. Requirements Analysis

A successful software acquisition begins with a comprehensive requirements analysis in which the agency identifies the functional, technical, and other requirements for the software. Functional requirements focus on the needs of users and the agency. Technical requirements describe the FIP environment in which the software must operate. Other requirements (e.g., budget) are imposed by external sources.

b. Analysis of Alternatives

After the requirements analysis, the agency determines whether already-existing software (e.g., commercial, another agency's) can meet its needs or whether it requires custom-developed software. If the agency requires custom-developed software, it next decides the best method for acquiring the software development services. Acquisition alternatives include choosing between noncontractual and contractual methods. The analysis of alternatives usually begins with a market survey to gather information about the competitive environment, industry contractual prices, types of services available,
etc. The agency also analyzes technical options for how the services will be performed (e.g., whether the development contractor will work on-site or off-site, whether to specify which development tools the contractor will use).

c. Developing the Source Selection Plan (SSP)

When contracting by negotiation, the agency develops an SSP. Designed for internal use, it describes how the agency will evaluate offerors' proposals and select a contractor. The SSP describes the organization of the source selection team, the steps in the source selection process, and the schedule. It includes a description of the evaluation process, methodology, and techniques; the manner by which evaluators will express judgments about the offers; and the standard for each judgment. It also discusses the relationship between cost and technical considerations that the agency will use in making an award.

d. Developing the Solicitation

For contracting by sealed bidding or negotiation, the agency develops a solicitation that specifies the Government's requirements for both the software and the software development contractor, and asks the vendor community for bids or proposals. The solicitation will be either an Invitation for Bids (IFB) or a Request for Proposals (RFP), depending on the contract method chosen. RFPs are more common for acquisitions of software development services because these acquisitions often do not meet the conditions for sealed bidding (e.g., the award must be based on price and price-related factors alone). The agency assembles all background information needed to inform vendors fully (e.g., data flow diagrams) and includes the documents as attachments or places them in a solicitation library that vendors may visit. The agency must also include in Section M of the solicitation all the factors it will use in selecting the contractor. Section M and the SSP must be consistent. For contracting by negotiation, refer to the procedures described in Chapter 10. For sealed bidding, small purchases, and acquisition from established sources of supply, refer to the procedures described in Chapter 11.

e. Issuing the Solicitation

For small purchases less than $25,000, the agency will usually contact vendors by phone for technical information and price quotations. For contracting by sealed bidding or negotiation, the agency will advertise the solicitation in the Commerce Business Daily (CBD) and distribute the solicitation to vendors who request it. While the solicitation is "on the street," potential offerors may ask for clarifications and changes. The agency will answer questions and may amend the solicitation. All vendors who received a copy of the solicitation must also receive copies of questions, clarifications, and any amendments.

f. Receiving Bids or Proposals

Bids or proposals are due on or before a fixed date and time. The agency can consider late bids or proposals only under the very limited circumstances defined in the Federal Acquisition Regulation (FAR) Parts 14 and 15.

g. Source Selection

(1) Evaluating Bids or Proposals
After the agency receives quotations, schedule contract information, bids, or proposals, it will evaluate the offers, ensuring they meet all mandatory requirements. For small purchases, established sources of supply, and sealed bids, the Contracting Officer (CO) can usually make an award, without further evaluation, to the offeror who meets all requirements at the lowest cost. For contracting by negotiation, the agency usually performs the additional steps described below before awarding the contract. In contracting by negotiation, the agency may rate technical proposals for merit, using evaluation factors and subfactors established in the SSP and the solicitation. The CO may ask offerors to clarify their proposals, correct deficiencies, or provide additional information for the evaluation team. The agency may also conduct other evaluation activities, such as contacting
references or interviewing offerors' key personnel. The CO will establish a competitive range, eliminate unacceptable offers, conduct negotiations with offerors, and ask them for Best and Final Offers (BAFOs).

(2) Selecting the Contractor
For contracting by negotiation, the CO or other Source Selection Authority will select the apparent successful offeror using the methodology established in the solicitation and detailed in the SSP. The offeror must also meet the standards in FAR Part 9 to be determined a responsible contractor.

(3) Awarding the Contract
For contracting by negotiation, the agency will award the contract to the offeror whose proposal is most advantageous to the Government, as determined by the source selection process.

4.3 IMPLEMENTATION PHASE

During the implementation phase, the agency prepares for and receives the software development services that the contractor will perform according to the contract. Activities related to the software development process itself are described in Chapters 13 through 16.

4.4 ACTIVITIES AT THE END OF THE ACQUISITION LIFE-CYCLE

When an agency acquires software development services by contracting, the acquisition life-cycle ends with contract close-out. An agency closes out a contract when it has received and accepted all software development services and deliverables specified. Close-out includes making final payment and resolving all claims by or against the contractor. If an agency acquires services through another means (e.g., an established GSA program), the activities at the end of the life-cycle will depend on the terms of that alternative.

CHAPTER 5: REQUIREMENTS ANALYSIS

5.1 INTRODUCTION

The requirements analysis is an early step in the acquisition life-cycle. Its objective is to define the user's need for Federal information processing (FIP) resources and, specifically in this Guide, the need for software. At the end of the requirements analysis, the agency will know the software capabilities and features it needs and the technical environment in which the software will operate (e.g., hardware, system software, telecommunications). However, the agency will not know whether it will acquire already-developed (e.g., commercial) or custom-developed software. The agency will make this decision as a result of the analysis of alternatives described in Chapter 6. When the agency determines it requires software development services, it must then define requirements for the software development contractor, as described in Chapter 9.

5.2 KEY FACTORS FOR SUCCESSFUL REQUIREMENTS ANALYSES

A successful software development services acquisition begins with a comprehensive and unbiased requirements analysis. Following the guidelines below will help make it a success.

a. Identify, Agree on, and Document Assumptions and Constraints

Because the requirements analysis comes early in the acquisition life-cycle, the individuals conducting it must make assumptions about the future. For example, they must project the workload the software will support based on a combination of the current workload, expected growth, and anticipated changes in laws or regulations. Agencies should state these assumptions explicitly in the requirements analysis documentation so they can be discussed, analyzed, and agreed upon. Ideally, program, Information Resources Management (IRM)/technical, and contracting personnel should review the assumptions. In addition, software acquisitions occur within a technical, organizational, political, and budgetary framework, not in a vacuum. Frequently, constraints are placed on the acquisition, such as available
funding, the technical expertise of agency personnel, or the ability of current technology to support the agency's requirements. The acquisition team should highlight these constraints in the documentation and seek agreement on them.

b. Devote an Appropriate Level of Effort to the Requirements Analysis

The time and effort devoted to the requirements analysis should correspond to the expected size and importance of the acquisition. For example, the agency should devote considerable resources to a requirements analysis for software critical to the agency's mission (e.g., entitlements approval software for an agency with a mission to deliver entitlements to the public), but would not devote the same level of effort for non-mission-critical software (e.g., software to generate purchase order forms for a small office).

c. Reassess Requirements Periodically

Because the requirements analysis occurs early in the acquisition life-cycle, the agency may obtain new, more detailed, information during later steps. This might affect the requirements as initially identified. The agency, therefore, should reassess them periodically and document the results of the reassessment in a revision to the requirements analysis. For example, during the analysis of alternatives, a market survey may show that commercial software meets the majority of the agency's requirements. With this new information, the agency should reassess the requirements commercial software does not meet and decide whether it can do without them. Alternatively, the agency might find that a specific software capability is available only at a very high price. The agency should determine whether to eliminate or redefine the requirement based on the available budget or the perceived value of the capability to the agency's mission or activities. Agencies should also reassess requirements at other points in the acquisition life-cycle. For example, an agency might issue a Request for Comments (RFC) containing draft software requirements. In their comments, potential offerors might point out requirements that are unclear or that they believe restrict competition, offering alternative ways to achieve the same result. The agency should consider these comments in reassessing the requirements. Finally, the agency's mission, responsibilities, and information processing resources environment will change over time. The FIP resources marketplace will also change, offering an ever-expanding range of capabilities. For a complex acquisition, a long period may elapse between the initial definition of requirements and issuance of a solicitation. The agency should periodically reassess its requirements to consider these internal and external changes.

d. Distinguish Between Mandatory Requirements and Optional Capabilities

When identifying required capabilities, distinguish between those mandatory to the mission and day-to-day activities and those which, while valuable, are not mandatory. For example, the ability to allow a user to specify search criteria may be essential for library software, but the ability to save these criteria for future sessions may not be as essential and, thus, could be optional.

5.3 ROLES AND RESPONSIBILITIES

In all phases of the acquisition life-cycle, program, IRM/technical, and contracting personnel have well-defined roles to perform and overlapping roles they share with the other disciplines. The three groups should work together toward common goals to facilitate the acquisition process.

a. Program Personnel

The program manager should take the lead in preparing the software requirements analysis documentation. Agencies should involve end users in identifying requirements. This is particularly important for software because the users interact directly with it more than they do with other FIP resources (e.g., operating systems software, hardware). Mechanisms to involve the users include interviews, workshops, written surveys, and prototyping.

b. IRM/Technical Personnel

IRM/technical personnel help program personnel define the software functional requirements, plan and budget for software projects, and conduct market surveys. For example, they can translate needed capabilities into requirements that provide suitable bases for developing software design specifications. They can also identify capabilities that current technology cannot yet support. They typically take the lead in defining the technical environment requirements.

c. Contracting Personnel

Because program and IRM/technical personnel typically perform software requirements analysis activities, contracting personnel have limited active involvement. However, since the agency must complete and approve the requirements document before issuing a solicitation, the individuals who prepare the analysis should request comments from contracting personnel.

5.4 PRELIMINARY STEPS

The agency should have completed long-term planning for its information technology needs and documented it in a five-year plan before beginning a requirements analysis for a specific software project. If not, the agency should conduct this planning now. The planning could focus only on individual organizational units or programs or on the entire agency; however, for simplicity, this chapter uses the term "agency" to cover all these cases. This section briefly describes the planning steps.

a. Determining Overall Information Needs

This step determines which of the agency's objectives FIP resources can address. As in all FIP resource planning, the agency starts by defining its business profile. Then, based on the profile and an assessment of what its current systems can support, it defines information technology objectives. It then determines how FIP resources can help achieve them.

(1) Determining the Agency's Business Profile
The agency should shape decisions about FIP resources to fit its business profile, which includes:
o Agency mission
o Work products
o Day-to-day job functions and activities
o Information flows to, from, and within the agency
o Current and potential users of FIP resources
Program personnel are the chief sources of this information.

(2) Determining What the Current Systems Can Support
The next step is to examine and document the current automated and manual systems that support the agency's mission and activities. The automated systems include both office systems and internal and external application systems (e.g., a payroll system). To determine how well a current system supports the mission and activities, ask users about real or perceived problems with:
o Efficiency or effectiveness of activities the system supports
o Reliability of equipment and software
o Timeliness of data or turnaround time for reports
o Ability to find data
o Access to FIP resources

(3) Defining Information Technology Objectives
To define objectives for information technology, look for ways FIP resources can improve how the agency does business. Use the agency's business profile and assessment of current systems to guide the development of these objectives. Initially, do not identify what specific resources will be needed, but rather, the mission objectives or outcomes that FIP resources may improve.

b. Planning and Budgeting for Software Projects

The second part of planning involves identifying software projects and budgeting funds for them. The Overview Guide, the first guide in this series, describes in detail the planning and budgeting processes for FIP resources.

(1) Identifying Software Projects
When the agency understands its business profile, the capabilities of its current systems, and its information technology objectives, it can determine the specific FIP resources it needs and group them into projects. Some of these will be software projects. They might support a specific administrative function (e.g., accounting, personnel) or a specific agency program (e.g., entitlements approval).
A. Defining Software Requirements
The agency should define software requirements at a high level during the planning and budgeting phase and in more detail when developing the requirements analysis documentation.
B. Identifying the Technical Environment
During
the planning phase, the agency should also determine the technical environment (i.e., hardware, system software, telecommunications) in which the software will operate. Many possibilities may exist. For example, the hardware environment may include an existing equipment facility [e.g., mainframe, minicomputer, local area network (LAN), or combination], or the agency may determine that it must acquire new hardware in addition to the new software. The system software environment includes the operating system, database management system (DBMS), and system utilities (e.g., performance tuning software). The hardware environment partially determines the systems software environment because most hardware currently on the market will only operate a limited number of brands of system software. However, the move toward open systems should reduce the extent to which these factors limit software applications. The telecommunications environment consists of specialized hardware and systems software designed to perform local and wide area networking functions. Many complex factors typically go into determining the telecommunications environment. Detailed guidance for defining the agency's technical environment is beyond the scope of this Guide. However, factors agencies should consider in selecting a technical environment include the ability of existing computing facilities to meet future workload requirements, the cost of maintaining current facilities compared with acquiring new ones, and whether information and processing will be centralized or distributed. Data conversion may be required if the software the agency is acquiring will replace old software. The old software may have been used for some time, and the agency may have generated considerable data with it.

(2) Preparing Plans
The agency should prepare plans for its FIP resources projects according to the following requirements:
o The Paperwork Reduction Reauthorization Act of 1986 requires executive agencies to develop and annually revise a five-year plan for meeting the agency's information technology needs. The Federal Information Resources Management Regulation (FIRMR) provides guidelines for developing the plan. Each year, agencies must send a copy of the revised plan to GSA.
o Office of Management and Budget (OMB) Circular A-130 requires executive agencies to establish multiyear strategic planning processes for acquiring and operating information technology that meet program and mission needs, reflect budget constraints, and form the basis for their budget request.
o OMB Circular A-11 requires executive agencies to submit annual agencywide "Major Information Technology Acquisition Plans."

(3) Budgeting
The agency needs to include funding for any planned software acquisitions in its budget requests to OMB. The budget cycle begins two years before funds become available. This long lead time emphasizes the importance of beginning planning and budgeting early, long before the need for the new software becomes critical. The agency may not have identified detailed requirements for the software at the time of the initial budget submission. Therefore, the agency must make its best estimate of the likely cost. The IRM/technical personnel can assist with these estimates. The agency can also talk to other agencies that have acquired similar software.

(4) Deciding Whether to Continue
Before defining the software requirements in detail, the agency needs to decide whether or not to continue the software project.
Many organizations neglect this step because they get caught up in the momentum of the acquisition process. Agencies should make a "go/no-go" decision at this point based on a benefit-cost analysis. Do not confuse this with the comparative cost analysis described in Chapter 6, which is used to choose among technical and acquisition options, nor with the Independent Government Cost Estimate (IGCE) described in Chapter 10. It is beyond the scope of this Guide to explain in detail how to conduct this benefit-cost analysis. Guidance can be found in publications such as A Guide for Requirements Analysis and Analysis of Alternatives, another guide in this series; Federal Information Processing Standards Publication (FIPS PUB) 64, Guidelines for Documentation of Computer Programs and Automated Data Systems for the Initiation Phase; and OMB Circular A-94, Guidelines and Discount Rates for Benefit-Cost Analysis of Federal Programs. However, keep the following considerations in mind while conducting the analysis:
o Include ALL costs for the project -- This includes not only the cost of the software, but also the cost of the following:
oo Verification and validation
oo Maintenance
oo Training
oo Documentation
oo User and operator support
oo Data and application conversion
oo Personnel costs for conducting the rest of the acquisition steps
Include both one-time costs (e.g., conversion) and ongoing costs (e.g., user support). Take into account the tendency for services to become more expensive over time.
o Consider costs and benefits to parties outside the agency -- For example, if the agency's new software communicates with other systems using a different file transfer protocol, other agencies that interact with the new software may have to replace their own.
o Consider both quantitative and qualitative
benefits -- There are three categories of benefits:
oo Benefits that can be expressed in monetary terms (e.g., salary savings from increases in productivity, prevention of monetary loss from new financial control software)
oo Benefits that can be expressed in other quantitative terms (e.g., a new correspondence control system permitting the agency to respond to inquiries 30 percent faster)
oo Benefits that can be expressed in qualitative terms (e.g., decision support software that makes it easier for managers to understand the implications of particular decisions, thus permitting them to make better ones)
o Compare ALL benefits and ALL costs -- Do not compare just the monetary benefits with the costs. This understates the total benefits of the project. If the total benefits appear to exceed the total costs under a reasonable range of assumptions, proceed with the acquisition. If the benefits do not exceed the costs, consider scaling back either the requirements or the scope of the acquisition (e.g., only developing software for agency functions with the highest payoff).
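The arithmetic behind the comparison above can be kept very simple. The sketch below (Python, not part of the Guide) totals hypothetical one-time and ongoing costs over a five-year life-cycle and compares them with the benefits that could be expressed in monetary terms; every figure is illustrative, and OMB Circular A-94 remains the authoritative guidance on discounting and analysis.

    # Illustrative sketch only -- all figures are hypothetical.
    LIFE_CYCLE_YEARS = 5

    one_time_costs = {"development": 400_000, "data conversion": 60_000, "training": 40_000}
    annual_costs = {"maintenance": 50_000, "user and operator support": 30_000}
    annual_monetary_benefits = {"salary savings": 150_000, "loss prevention": 60_000}

    total_costs = sum(one_time_costs.values()) + LIFE_CYCLE_YEARS * sum(annual_costs.values())
    total_benefits = LIFE_CYCLE_YEARS * sum(annual_monetary_benefits.values())

    print(f"Total life-cycle costs: ${total_costs:,}")
    print(f"Quantified benefits:    ${total_benefits:,}")
    print("Monetary benefits alone exceed costs" if total_benefits > total_costs
          else "Weigh the remaining qualitative benefits before deciding")

Even when the quantified benefits fall short, the decision is not automatic: the qualitative and other quantitative benefits described above still belong in the comparison before the agency scales back or cancels the project.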

5.5 TYPES OF REQUIREMENTS TO IDENTIFY

Software requirements fall into two categories:
o Functional and Performance -- Focus on what the users and the agency need to accomplish when using the software and how the software must operate (e.g., transaction speed, accuracy of results)
o Technical Environment -- Describe the FIP resources environment in which the software must operate
These requirements are described below.

a. Software Functional and Performance Requirements

Functional and performance requirements focus on what the software provides to support the mission needs of the agency and the day-to-day activities of the users. They address not only the software itself, but also some of the services (e.g., training) that support its use.

(1) Capabilities
Capabilities describe the functions the software must support and the features it must offer. An example of a functional capability requirement for financial software would be "must produce a report listing expenditures for each office by month for a single fiscal year and totaling by month based on type of expenditure." Functional capability requirements should describe the processing the software will support (e.g., identifying overdue accounts), the nature of the data to be maintained (e.g., fixed-length characters or numbers, free-form text, digital images), and other characteristics of the activity supported (e.g., required reports).

(2) User Interface
User interface requirements define both the degree to which the user interacts with the software and how this interaction should occur. Agencies should identify the intended users and functions of the software and allow offerors to propose appropriate interfaces. Users can control the software interactively (also referred to as on-line), or predefined commands stored in the software can perform the necessary functions (also referred to as batch). Ease of use is a concern for interactive software. For example, if personnel with little previous exposure to FIP resources use the software, they may require a menu-based command structure or graphical user interface (GUI) so they can use a device (e.g., a mouse) to point to commands or icons on the screen to execute functions. Other user interface requirements might deal with on-line, context-sensitive help and a mouse or pointing device in addition to a keyboard. IRM/technical personnel familiar with computer languages typically control batch software. For batch software, ease of use may not be a primary concern.

(3) Workload
Workload is the amount of data and processing the software must support.
A. Number of Users
For minicomputer or mainframe software or software operating on a network, one measure of workload is the number of people and/or locations using it at the same time.
B. Throughput
Throughput refers to the number of transactions (i.e., units of work) the software can process in a given time (e.g., an hour). No standard definition for a transaction exists, but examples include entering a record or querying a database. Agencies should specify how they define transactions.
C. Data Volume
Examples of data volume include the number of records or images to be managed or reports to be produced.

(4) Speed
An agency should define requirements describing how quickly the software must complete specific operations. For example, a requirement for a database management system might state "must be able to retrieve a 300 byte record from a file of one million records, based on a Structured Query Language (SQL) query using two key fields, within 20 seconds." A requirement for a digital imaging system might state "must be able to retrieve a claims document image within 15 seconds."

(5) Growth
Growth describes how the workload may change during the life of the software.
Page 37: A GUIDE FOR ACQUIRING SOFTWARE DEVELOPMENT SERVICES

Express these requirements quantitatively, and document the assumptions underlying the estimates. Growth also addresses adding capabilities after the software is installed and/or taking advantage of improvements in technology. For these, the agency may wish to define requirements for technology refreshment (e.g., engineering changes and technology substitution).

(6) Reliability

Reliability describes how much error in the software code the agency can accept. Although reliability is usually thought of in terms of FIP hardware (e.g., "downtime is no more than six hours during any calendar month"), software reliability requirements are also important since the result of unreliable software (i.e., downtime) is the same. Agencies typically state reliability requirements in terms of the number of consecutive hours or days without a system failure.

(7) Availability

Availability describes how accessible the software must be to the users. An availability requirement may state, "the software must be available on a workstation located at each claim examiner's desk."

(8) Accessibility by Individuals with Disabilities

In 1986, Congress revised the Rehabilitation Act of 1973 to add a new section that requires that Federal electronic office equipment be accessible to individuals with disabilities. Guidelines on how best to meet the requirement were established in October 1987, based on input from Government and industry accessibility experts. In 1988, the FIRMR incorporated the new statutory requirement and guidance. The most important factor in providing accessibility is awareness by program, IRM/technical, and contracting personnel of the requirements of disabled workers. GSA operates a Clearinghouse on Computer Accommodation (COCA) that provides publications, demonstrations, and technical assistance. COCA has published a handbook, Managing Information Resources for Accessibility, that outlines how agencies can address accessibility in overall IRM planning. In addition, GSA's Council on Accessible Technology (COAT) has representatives from 28 agencies that are committed to advancing accessible information environments. Some of the agency considerations for ensuring accessibility include the following:

o Retain capability for character-based interfaces -- Both commercial and custom-developed software are making increasing use of GUIs, such as windows and icons. However, automated screen readers and other devices that aid blind users cannot currently read information presented in graphic formats. The agency should make sure that disabled individuals can bypass the GUI and operate the program with character-based interfaces.

o Obtain electronic versions of software documentation -- If the software documentation (e.g., user manuals, tutorials) is in electronic form, automated tools can convert the text to synthesized speech or braille output for sight-impaired users.

o Ensure that special terminate-and-stay-resident (TSR) programs supporting disability access are compatible with network software -- Many of the special automated tools supporting access for disabled users operate in TSR mode. This means they are loaded into a portion of random access memory (RAM) for use with any underlying applications software. Some LAN programs also use TSR modules for user access, but these may be incompatible with other TSR programs. The agency should ensure that the network software is compatible with the TSR software.
Contact the COCA or the agency COAT representative for more information about accessibility and the products that accommodate users when the standard monitor and keyboard are inadequate.

(9) Control

For some software, the agency will need to consider control requirements. Examples of control requirements for a multi-user database management system include:

o A record locking capability so that two users cannot alter a record at the same time

o Differing access to capabilities for different classes of users

o An audit trail of access and transactions

(10) Security and Privacy

Security and privacy describe how the software must control access to data. Important elements of security are requirements for user IDs, passwords, and maintaining an audit trail of users accessing the software. Other elements could include requirements for protection of sensitive and classified data. The requirements should comply with the Computer Security Act of 1987. The Department of Defense (DoD) has issued Trusted Computer System Evaluation Criteria, commonly known as The Orange Book, which describes levels of computer security. The National Institute of Standards and Technology (NIST), OMB, and the FIRMR provide other security-related guidance. The Privacy Act of 1974 sets requirements for protection of data about individuals, such as personnel records or tax returns. The agency should identify any applicable privacy requirements.

(11) Training

Training is vital if an agency is to benefit from new software. Training requirements might include the number of users to train, how the training should be delivered (e.g., classroom, computer-aided instruction), location (e.g., Government site, contractor site), and what subjects should be covered. Hands-on training, permitting users to key in commands and data, view the screen, and use the
supporting equipment (e.g., printers), generally provides more benefits than lecture-only training. Agencies should also consider the need for continuing training. For example, novice users may absorb only basic functions during initial sessions and require additional training to learn more advanced functions. In addition, all organizations experience some personnel turnover. New personnel may need training to use the software.

(12) Conversion

If the agency needs to convert existing data, it should describe the amount of data (e.g., number of records) to be converted, the automated environment in which the data currently reside, and the environment in which they will reside when converted. The agency will have identified some of this information during the conversion study conducted during the planning stage.

(13) Documentation

The agency will need written documents to support users (e.g., explaining how to use each command, menu item, or icon), operators (e.g., describing data recovery procedures after a system crash), and programmers (e.g., explaining how to modify the software code or update master tables). The agency should identify what documents it will need, as well as their required features (e.g., glossaries, indices, sample screens, quick reference cards).

(14) User and Operator Support

These requirements describe necessary additional, non-written support. For example, users may need toll-free telephone support during regular business hours, or they may require both telephone and in-person assistance. Describe the nature and extent of the assistance required (e.g., answering functional questions, such as "How do I produce a month-end closing report?," and technical questions, such as "How can I tune the database to improve performance?") and users' skill level(s).

(15) Standards

The agency should identify Governmentwide, agency, and industry standards the software should meet. Using standards ensures compatibility and integration with other Government systems. However, the agency must justify the use of any restrictive standards (e.g., MS-DOS). The most extensive set of Governmentwide standards is the FIPS PUBS developed by NIST. FIPS PUBS include technical standards applying to software [e.g., FIPS PUB 151-1, Portable Operating System Interface for Computer Environments (POSIX)]. The agency may also require compliance with industry standards published by the American National Standards Institute (ANSI), the Institute of Electrical and Electronics Engineers (IEEE), the International Standards Organization (ISO), or other organizations.
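
Quantitative requirements such as the speed example in item (4) above lend themselves to simple, repeatable acceptance checks. The following sketch is illustrative only: the database file, table, and key fields are invented for the example, and an agency or its contractor would substitute its own technical environment and the query defined in the requirement.

    # Illustrative only: a hypothetical timed acceptance check for the sample speed
    # requirement quoted in item (4) above. The database file, table, and key fields
    # are invented for this sketch and are not part of the Guide.
    import sqlite3
    import time

    MAX_SECONDS = 20  # quantitative threshold taken from the sample requirement

    def timed_retrieval(connection, region_code, claim_number):
        """Run the keyed query once and return (row, elapsed_seconds)."""
        start = time.perf_counter()
        cursor = connection.execute(
            "SELECT * FROM claims WHERE region_code = ? AND claim_number = ?",
            (region_code, claim_number),
        )
        row = cursor.fetchone()
        return row, time.perf_counter() - start

    if __name__ == "__main__":
        conn = sqlite3.connect("claims.db")  # hypothetical file of one million records
        row, elapsed = timed_retrieval(conn, "NE", "1994-000123")
        print(f"Retrieved in {elapsed:.2f} seconds")
        assert elapsed <= MAX_SECONDS, "Speed requirement not met"

Stating the requirement in a form that can be re-run in this way helps both the agency and the offeror agree on how compliance will be measured.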

b. Technical Environment Requirements

These describe the FIP resources environment in which the software will operate. The agency defines the technical environment in general terms during the planning phase. In the requirements analysis, the agency describes its detailed requirements.

(1) Hardware Environment

The agency must describe the hardware environment on which the software will operate. If the agency already has the hardware in place, describe its characteristics (e.g., processor, amount of memory, magnetic and/or optical storage) and/or its brand, make, and model. If the agency plans to acquire the hardware through an acquisition in parallel with the software development services, list the specifications for the hardware.

(2) System Software Environment

Describe the requirements for the system software environment. If the agency already has the system software in place, describe its characteristics (e.g., operating system, scheduling, management and control capabilities, DBMS record-locking features) and/or its brand name. If the agency plans to acquire the system software through an acquisition in parallel with the software development services, list the specifications for the system software.

(3) Telecommunications Environment

Describe the requirements for the telecommunications environment. If the agency already has the telecommunications in place, describe its characteristics (e.g., LAN topology and protocols) and/or its brand name. If the agency plans to acquire the telecommunications through an acquisition in parallel with the software development services, list the specifications for the telecommunications.

(4) Integration with Other Systems

Describe requirements for integrating with other systems. These may be within the agency (e.g., interface with the agency's central accounting system) or outside (e.g., interface with a Federal Reserve bank). Many of these requirements will fall in the telecommunications area, covering topics such as file transfer protocols, application program interfaces, and network services.
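
Recording these environment descriptions in a consistent, structured form can make them easier to review and to carry forward into the solicitation. The fragment below shows only one possible convention; every value is a placeholder invented for the sketch, not a recommendation.

    # Illustrative only: one way to record technical environment requirements in a
    # structured, reviewable form. All values are invented placeholders.
    from dataclasses import dataclass, field

    @dataclass
    class TechnicalEnvironment:
        hardware: dict = field(default_factory=dict)
        system_software: dict = field(default_factory=dict)
        telecommunications: dict = field(default_factory=dict)
        interfaces: list = field(default_factory=list)

    existing_environment = TechnicalEnvironment(
        hardware={"processor": "example minicomputer", "memory": "64 MB", "storage": "magnetic disk"},
        system_software={"operating_system": "example OS", "dbms": "example DBMS with record locking"},
        telecommunications={"lan_topology": "token ring", "protocol": "example protocol"},
        interfaces=["agency central accounting system"],  # systems the software must integrate with
    )

    print(existing_environment)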

5.6 DOCUMENTING THE REQUIREMENTS ANALYSIS

The FIRMR requires that agencies prepare a requirements analysis document before issuing a solicitation. The document can also serve as a mechanism to achieve agreement among program, IRM/technical, and contracting personnel working on the acquisition. Finally, the requirements analysis is used in developing solicitation specifications or work statements. For all these reasons, the agency should prepare a comprehensive and clear requirements document before beginning the next step in the acquisition process, the analysis of alternatives.

CHAPTER 6: ANALYSIS OF ALTERNATIVES

6.1 OBJECTIVE

The objective of an analysis of alternatives is to determine the acquisition method that is the most advantageous to the Government. Acquisition alternatives include two major components on which most of the agency's effort will be concentrated: the technical solution and the acquisition solution. Figure 6-1 illustrates the process. A successful acquisition requires matching the scale of the analytical effort to its scope, size, and criticality. According to the Office of Federal Procurement Policy (OFPP) Policy Letter 91-2, Service Contracting, the agency should be only as specific in a solicitation as its requirements dictate. This may mean stating only the software development services required or defining specific service characteristics or delivery methods (e.g., on-site versus off-site). If the agency only identifies the software development services required, the analysis of technical alternatives will focus primarily on matching the services with industry-recognized terms. For more specific requirements, the analysis of technical alternatives must be more comprehensive, addressing not only the services required, but also any stipulations concerning them [e.g., tools or methods such as use of specific Computer Aided Software Engineering (CASE) tools]. This chapter describes the tasks required in both circumstances.

a. Analysis of Technical Options

In examining technical alternatives for software development services, an agency must recognize that many different technical approaches can provide the same service. Even when an agency describes its needs in functional terms, offerors will propose solutions made up of specific approaches and techniques. Since some approaches might place unacceptable burdens on the agency (e.g., Government-furnished space, longer response times), the agency should anticipate and forestall them. Also, the agency may encounter problems in the acquisition process if it does not understand the technical approaches available in the marketplace. For example, the agency may inadvertently eliminate promising solutions because the solicitation requirements or the source selection approach constrain innovative software development approaches (e.g., rapid prototyping). If the agency plans to identify specific software development methodologies or tools in the solicitation, it must examine what is available, decide on the acceptable range of approaches, and justify any limitations.

b. Analysis of Acquisition Options

The agency must also analyze its acquisition options. It can acquire software development services either by contractual or noncontractual methods. Each method has advantages and disadvantages that must be considered in light of the agency's requirements and available resources. To make the best use of existing Government resources, the agency should identify and analyze feasible noncontractual methods before contracting.

6.2 CHOOSING BETWEEN ALREADY-AVAILABLE AND CUSTOM-DEVELOPED SOFTWARE

[Figure: Overview of the Analysis of Alternatives Process]

Although this Guide addresses software development services, using such services is only one of several options agencies have for meeting their software requirements. These options include:

o Mandatory-for-use programs

o Mandatory-for-consideration programs

o Reengineering existing agency software

o Using another agency's software

o Commercial software

o Custom-developed software

Since custom-developed software presents the greatest risk, agencies should consider the others first. They can be combined (e.g., customizing a commercial software package), but this chapter will treat each as a separate option.

a. Mandatory-for-Use Programs

According to the Federal Information Resources Management Regulation (FIRMR), the agency must consider GSA's mandatory-for-use programs. The FIRMR describes the GSA mandatory-for-use services and programs and the conditions under which an agency may request exemption from them. These services and programs currently include:

o FTS2000 Network Services

o National Security and Emergency Preparedness (NSEP)

o Financial Management Systems Software (FMSS) Multiple Awards Schedule (MAS) Contracts Program

The FIRMR offers guidance about these and provides the latest list of GSA-mandated programs and services. Not all of them provide software directly. The agency must also consider mandatory agency sources, if any exist. If several offices perform a function that is similar or unique to an agency, it may mandate use of specific software. The agency's Information Resources Management (IRM) office can determine if a mandate-for-use covers the software need identified in the requirements analysis. If it does, the acquisition team must assess whether the mandatory-for-use software is appropriate. This assessment should compare the agency's desired functionality, throughput, and technical requirements with the features and capabilities of the mandated software. If these are inadequate, agency regulation or guidance may require the acquisition team to provide written justification of the rationale for acquiring other software or services. The agency may also have in place a requirements contract for software development services. Again, the agency's IRM office can determine if the scope of the contract can meet the requiring office's needs.

b. Mandatory-for-Consideration Programs

FIRMR Bulletins C-2, Disposition and Reuse of Federal Information Processing (FIP) Equipment; C-12, Federal Software Exchange Program; and C-19, Information Systems Security, provide additional details on using these programs. These programs currently include:

o Consolidated Local Telecommunications Service Program

o Federal Software Exchange Program (FSEP)

o Excess FIP Equipment Program

o Federal Secure Telephone Service (FSTS)

o Information Systems Security (INFOSEC)

Not all these programs provide software directly.

c. Reengineering Existing Agency Software

Reengineering existing software is another possible solution to the agency's needs. Reengineering attempts to maximize the use of existing software by upgrading it with the latest technology. To reengineer its software, an agency would have to modify its existing code to reshape it into a modern system. This can range from minor modifications to bring the existing software up-to-date (e.g., improving only the user interface) to major redesign of internal functions. To determine if reengineering is viable, the agency should assess how much of the existing software is salvageable. Major factors to consider include:

o Scale of the difference between the software's current role and mission and its envisioned role and mission (e.g., expanding a transaction-based system into an ad hoc reporting system)

o Existence, quality, and accuracy of design and code documentation

o Stability of the software's requirements since it was developed (i.e., if it is performing essentially the same functions as when first developed)

o Quality of the design and coding techniques used to develop the original software

If this option seems promising after a high-level assessment, the agency should perform a more detailed assessment. CASE tools now make it possible to recover more from old
software than in the past (e.g., reconstructing the design from the source code when a design document does not exist). Although these tools can provide a comprehensive assessment of what reengineering would require, they still demand a significant investment of funds and time to acquire, train for, and use. Before making this investment, the agency should consider whether the existing software provides a good base of information for new software.

d. Using Another Agency's Software

The agency can also consider using another agency's software either "as is" or customized. Some agencies establish interagency agreements to make their software and other FIP resources available to other agencies. The agreement might cover using the software on the host agency's computers or providing copies for the borrowing agency. The lending agency usually provides the software "as is." If the borrowing agency requires customization, it will probably have to do it and then support the modified code. Also, unlike commercial software vendors, who provide technical support, new releases, and upgrades, the lending agency may not offer technical support and is less likely to ensure that the borrowing agency receives updates. Therefore, the agency should consider not only the match with its functional requirements and technical environment, but also its requirements for technical support and control over software changes (e.g., deciding when modifications will be implemented).

e. Commercial Software

Another Guide in this series, A Guide for Acquiring Commercial Software, covers this category thoroughly. Commercial software is another option under already-existing software. The following are some benefits of using commercial software:

o It has been proven in the marketplace

o It is generally more thoroughly tested than custom software

o Its vendors provide technical support and maintenance

o It is a stable application that can be implemented quickly

The agency should compare its functionality, throughput, and technical requirements with commercial software products. If these do not match exactly, the agency should consider either relaxing or forgoing some requirements or customizing a commercial software product. As the differences increase, the acceptability of the commercial software solution decreases.

f. Custom-Developed Software

The agency should also consider custom-developed software. While it may take longer to acquire, prove less stable, and require specialized training and support, properly developed custom software is likely to meet user requirements more exactly than the other options. After considering the six options above, the agency needs to decide between acquiring already-available or custom-developed software. This decision is the first the agency makes during the analysis of alternatives.

6.3 KEY FACTORS FOR SUCCESSFUL ANALYSES OF ALTERNATIVES

Several key factors can help an agency select the best technical and acquisition options. Some depend on the specific software development services being acquired; however, the seven listed in Figure 6-2 are important regardless of the range of services covered by the acquisition. These factors keep the analysis objective and ensure that the results incorporate the key decision factors and relevant points of view.

[Figure 6-2: Key Factors for Successful Analyses of Alternatives]

a. Base the Analysis on the Agency's Requirements

Use the results of the agency's requirements analysis in the analyses of both technical and acquisition alternatives. The requirements analysis identifies mandatory requirements. Any software development services alternative not meeting them would be inadequate for the agency's needs. When beginning an analysis of alternatives, the number of potential alternatives could be high, perhaps in the dozens. One way to identify the feasible alternatives for detailed analysis is to eliminate those that cannot meet one or more mandatory requirements. For example, a mandatory requirement for software development
services may be experience with and use of a technical environment similar to the agency's (i.e., hardware, system software, database management system, development tools) and an acquisition option might be to acquire these services by using another agency's contract. (The other agency's contract must permit such use.) However, if the second agency's contractor has no experience with a technical environment similar to the first agency's, the acquisition option could not meet the mandatory requirement. Although the analysis of alternatives must be based on the requirements analysis, those requirements may change over time. Legislative, political, or economic factors may change the agency's needs or priorities. The agency might have to change a mandatory requirement based on information discovered in a market survey or cost analysis. These possibilities are discussed in more detail later in this chapter.

b. Complete the Analysis Before Deciding on the Solution

Completing the analysis before deciding on the best alternative is critical for determining the one that is most advantageous to the Government. Often during an analysis, the team conducting it has a specific approach in mind or finds that one has become the front-runner. Since acquisitions are often done within tight time constraints, the team may be tempted to stop looking when it finds an apparently acceptable approach. However, that alternative may not meet requirements from all perspectives (program, IRM/technical, and contractual), or another may be a better value. When a tentative decision is reached, it frequently develops its own momentum. As a result, key issues may not be seriously considered. For example, program personnel may decide to contract out all software development services without considering critical issues such as whether the sensitive or critical nature of some of the functions might introduce risk. Or an interagency agreement might be used because it is easy to establish rather than because it will provide the most effective software development services. Typically, these shortcomings become evident after the agency makes the commitment to acquire the services. The acquisition effort should take a comprehensive approach to the alternatives and examine issues important to each of its three audiences so that any tradeoffs are deliberate decisions and not forced by events or expediency.

c. Identify, Agree on, and Document Assumptions and Constraints

Identify and agree on the assumptions and constraints used in the analysis and document them. These include:

o The range of software development services the agency requires

o The duration of the service arrangement

o The level of agency personnel involvement and cost for acquiring, establishing, and administering the new software development services project(s)

State these assumptions explicitly. They can then be discussed, assessed, validated for consensus, and documented. Without consensus, serious miscommunication may occur among team members, affecting the analysis and, possibly, its results. Also document the constraints under which the analysis is conducted. These include factors that limit scope, depth, or complexity, including agency policies, funding, schedule, staff availability, and the current technical environment (i.e., the technology and techniques the agency currently uses). Such constraints determine the feasibility of some options and may significantly alter the options the team considers and the conclusions it reaches. Therefore, documenting the circumstances under which the acquisition team conducted the analysis is valuable and necessary, sometimes critical.

d. Use a Limited Set of Key Criteria to Distinguish among Options

Using a limited set of key criteria to distinguish among options helps focus the analysis on the most important discriminating factors. As the number of criteria increases, the impact of each criterion decreases, thus diffusing rather than focusing the analysis. The selection criteria should include no more than six to eight factors that are not only the most important to the agency, but also will distinguish among the alternatives. Researchers have found that people have difficulty processing more than that number. If the team identifies more than eight criteria, it should re-examine them to see if each:

o Represents the most important concerns of the agency

o Represents different concerns and not different aspects of the same concern

o Distinguishes the alternatives from each other so that not all
options meet the criteria to the same degree

If all the options meet a criterion equally well, it is not a good criterion in spite of its importance to the agency. Usually, a re-examination will result in either the consolidation of two or more criteria into broader ones or the elimination of some altogether. Remember these are selection criteria, not specifications. The team will write specifications when it writes the solicitation.

e. Do Not Eliminate Options by Dictating a Specific Approach

Since requirements should be reassessed periodically, do not eliminate options by dictating specifically how the software development services should be delivered. Just as FIP technology changes rapidly, the nature of software development services can change rapidly as well. By not restricting service delivery approaches, an agency can take advantage of technical advances (e.g., using the microcomputer as a development platform). Because many approaches can deliver the same service, the agency can be flexible and thereby gain other advantages (e.g., lower cost, higher level of service). Contractors will be more familiar with approaches they have developed and so be more effective. However, as these diverge from those the agency is familiar with, the difficulty of tracking contractor progress and evaluating results will increase. For example, interim milestones might have different names or contain different information than the agency expects. The agency should balance the benefits of innovative approaches against the comparative cost of monitoring their progress. The agency should make such tradeoffs deliberately, considering relevant factors, rather than by accident or default. This allows the acquisition team to select the option that is most advantageous to the Government.

[Figure 6-3: The Analysis Process]

f. Recognize that Requirements and Alternatives Evolve Over Time

Acquisitions would be simpler if the analysis process were linear, but in practice, it is iterative. Both agency requirements and the alternatives available to meet them evolve over time as more information becomes known or circumstances change. As illustrated by Figure 6-3, each of the steps in the process can be affected by changes in the others. Plan to revisit past decisions and documents to reflect the practical realities of this evolution process, such as:

o Advances in software development service techniques or practices provide new opportunities that agencies can and should exploit -- The acquisition team may be unaware of new software development techniques or practices even though these would be useful. For example, for years software was developed only on the technical platform on which it would be implemented. This increased the resources required and limited development options. However, advances in multi-platform CASE tools and portability standards have made it possible to transfer software among environments and use less expensive resources (e.g., microcomputers) as development platforms. Advancements in FIP resources occur too quickly to keep abreast of all new developments on an ongoing basis. Be willing to refine the requirements to include new developments.

o Changes in technical options or requirements may affect which acquisition option is best -- As the acquisition team refines technical requirements, it may find that options for acquiring the software development services change. For example, relaxing a requirement may allow additional sources of supply to meet the agency's needs. On the other hand, a new requirement (e.g., compatibility with other software) might eliminate some acquisition options that were available earlier.

o Constant refinement and review are necessary to keep agency requirements synchronized with the alternatives analysis -- Each refinement in requirements has the potential to create a ripple effect in the other areas. For example, when a survey of the market causes a refinement in agency requirements, the team must re-examine the analysis of alternatives to validate its results.

o No single decision is final until a comprehensive decision has been reached that satisfies all areas -- It must:

oo Be consistent with what is available in the market,

oo Be consistent with the requirements of the agency, and

oo Present an acceptable acquisition option.

g. Test the Sensitivity of the Analysis to Changes in Assumptions and Constraints

To avoid surprises, the acquisition team should test the sensitivity of the analysis to changes in key assumptions. Usually, three or four of the assumptions documented at the beginning of the analysis are
pivotal to its results. For example, suppose an agency assumes that Government facilities would be available to support software development services. What would be the effect if this were incorrect? Suppose an agency assumes that the contractor will use a certain approach to verify software performance. What if a different approach is used? Sensitivity analysis allows the agency to assess the risk associated with a given alternative. Based on the sensitivity analysis, the agency may alter a decision resulting from the original analysis, prepare contingency plans, or both. For example, the agency may choose a slightly less attractive but less risky solution or develop contingency plans to reduce the possible impact if an assumption is incorrect.
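
Sensitivity testing of this kind can be done with very simple tooling. The sketch below is illustrative only: the baseline cost and the cost deltas attached to each assumption are invented figures, and an agency would substitute the estimates from its own analysis.

    # Illustrative only: a simple sensitivity check over two hypothetical assumptions.
    # The baseline figure and cost deltas are invented for this sketch.
    from itertools import product

    BASELINE_COST = 1_000_000  # hypothetical life-cycle cost estimate for one alternative

    # Additional cost if an assumption turns out to be wrong (hypothetical figures).
    assumption_deltas = {
        "Government facilities unavailable": 150_000,
        "Different verification approach used": 80_000,
    }

    def scenario_cost(failed_assumptions):
        """Return the estimated cost when the listed assumptions do not hold."""
        return BASELINE_COST + sum(assumption_deltas[name] for name in failed_assumptions)

    # Evaluate every combination of the assumptions holding or failing.
    names = list(assumption_deltas)
    for flags in product([False, True], repeat=len(names)):
        failed = [name for name, wrong in zip(names, flags) if wrong]
        label = ", ".join(failed) if failed else "all assumptions hold"
        print(f"{label}: ${scenario_cost(failed):,}")

Even a small table of results like this one makes it clear which assumptions, if wrong, would change the ranking of alternatives and therefore warrant contingency planning.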

6.4 ROLES AND RESPONSIBILITIES

Everyone involved in the acquisition -- program, IRM/technical, and contracting personnel -- serves a distinct and crucial role in analyzing the alternatives. Throughout the acquisition process, these three groups make closely related decisions that usually require tradeoffs among each area of expertise. Therefore, all three groups should be actively involved in the process from the very beginning to ensure balanced decisions. For example, program office needs for rapid software completion and implementation may be balanced against the cost and technical risk involved in requiring the contractor to accelerate the schedule. After estimating the cost (i.e., resource commitment in time, staff, machines, and funds) and assessing the risk, the program staff may accept a longer schedule. Similarly, contract personnel can help determine if technical or program requirements limit competition. Tradeoffs should be made deliberately rather than by accident or by default. Only by collaboration can the team ensure selection of the most advantageous technical options for the Government.

a. IRM/Technical Personnel

The IRM/technical personnel perform a major role in analyses for software development services. They represent the technical viewpoint and must ensure that the technical decisions support program and user requirements and are technically sound. They should be aware of the functional, performance, and capacity implications of software development service options. If necessary, they should educate program and contract personnel about the technical and support implications of the available options. Their responsibilities include:

o Providing oversight assistance in acquiring software development services

o Assisting in identifying available resources suitable to support the agency's software development services objectives

o Assisting in conducting or directing the analysis of alternatives, including market surveys and related cost studies

o Assisting in justifying other than full and open competition, if appropriate

o Assisting in selecting the most advantageous alternative

b. Program Personnel

Program personnel represent the end-user viewpoint and must ensure that the decisions reached will support the program and users' requirements. Program personnel must consider how the software development services will be used, who will be most affected, what level of user expertise one can expect, who will be supported by the services, and where the personnel are located. Program personnel must also address factors that may be specific to their program, such as security restrictions and control requirements. They should determine to what extent they would be willing to alter the way they conduct their business to use the software development services. Their specific responsibilities include:

o Identifying available program resources and constraints (i.e., staff, equipment, funds, and time) for this analysis

o Conducting or directing analyses, including market surveys, related cost studies, and analysis of alternatives

o Justifying other than full and open competition, if appropriate

o Assisting in selecting the most advantageous alternative

[Figure 6-4: Examples of Information Sources for the Software Development Services Market Survey]

c. Contracting Personnel

Contracting personnel ensure that contract actions comply with relevant laws and regulations and produce results in the Government's best interest. The Contracting Officer (CO), as head of the acquisition team, is responsible for seeing that an acquisition is conducted within applicable
regulations, offers maximum competition, and pursues a course most advantageous to the Government. Contract personnel assigned to this analysis must keep these responsibilities in mind. They must make sure that each step of the process is well documented for regulatory compliance. Specific responsibilities during analysis of alternatives include:

o Identifying and assisting in evaluating feasible contracting alternatives

o Assisting in conducting or directing the analysis of alternatives, including market surveys and related cost studies

o Assisting in justifying other than full and open competition, if appropriate

o Assisting in selecting the most advantageous alternative

6.5 SURVEYING THE MARKET

The market survey should provide the acquisition team with extensive knowledge of what software development services are available from both private industry and the Government. It also should provide a basic understanding of software development approaches, techniques, tools, and services with which to evaluate the technical merit of offerors' proposals. It is a cornerstone of any analysis of alternatives. The survey should include all sources of information, such as industry and Government publications, trade show literature, vendor literature, vendor presentations, and user comments or references. If the team knows relatively little about software development, it should include background materials in the survey to become familiar with industry terms and conventions. The market survey has four objectives:

o To verify the technical feasibility of satisfying the agency's requirements

o To determine the amount of competition available to provide the software development services required

o To collect pricing information for cost estimates and comparative cost analyses

o To determine the industry norms and business practices for software development services

Both program and IRM/technical personnel should be directly involved in the survey. To properly compare the software development services in the market against agency requirements, program and IRM/technical personnel must work from a common understanding of the approaches, technologies, tools, and services available, and the terms used to describe them. Program personnel cannot evaluate the tradeoffs they will be asked to make if they do not understand the terms. Similarly, IRM/technical personnel cannot advise the program personnel about technical tradeoffs if they are unfamiliar with what the market offers. Contracting personnel need to be familiar with software development services and technology, but to a lesser degree. Their primary concerns include ensuring that all reasonable sources have been considered and, when a contracting option is chosen, that sufficient competition is available to meet regulatory requirements.

a. Sources of Information

The acquisition team can draw on a variety of sources for market information. As Figure 6-4 illustrates, the sources provide a wide range of information with varying degrees of objectivity. Vendors present their software development approaches, tools, and services in the most favorable terms possible. Research organizations tend to provide more objective data, but may not provide it in the same technical detail as the vendors. Each source provides a slightly different kind of information.

(1) Trade Publications

Trade publications carry vendor announcements, articles, or descriptions of service-oriented projects and often compare similar software development techniques and tools. Sometimes they provide user surveys that compare and rank key characteristics of these techniques and tools and, occasionally, service providers. These publications can help identify key characteristics and help educate agency personnel about software development. However, except for user surveys, trade magazines frequently rely on vendors for information about their software development techniques and tools. This information, therefore, should be treated the same as vendor claims received from other sources (e.g., trade shows, vendor literature).

(2) Vendor Marketing Literature

Vendors produce publications and advertisements that describe software development functions, techniques, and supporting automated tools to promote their products and related services. These generally emphasize the advantages of their offerings over competing ones. Vendor literature is useful for identifying the bases on which vendors compete. For example, vendors might emphasize the ease of integration with an existing applications base, knowing this is important to their target market of potential customers. Agency personnel reviewing this literature may realize that this kind of integration could be an
important issue for their agency, too. Knowing the basis on which vendors compete can help in refining an agency's requirements. Promotional material presents the vendor's offerings in the most favorable light. It is unlikely to identify weaknesses. Also, comparisons with competitors' offerings are probably skewed in favor of the vendor. Therefore, the claims made in these materials should be reviewed with this in mind.

(3) Other Publications

Some companies offer information services that provide independent research on FIP resources, as well as general reports on services, approaches, techniques, and tools an agency might include in a software development services acquisition. They typically describe each service, approach, technique, or tool in general terms, giving an approximate cost when available. The data provided by these companies are a mix of vendor-provided and user survey information. Information from vendors should be treated as vendor advertising. These information services also provide special reports about the software development industry that can help the acquisition team understand the services and tools available. For example, such a report would discuss the common definitions of software development services and related terms, how software development services have evolved over time, and the range of current approaches (e.g., reengineer existing software). It might also describe common pricing practices. These facts prepare the team for discussions with vendors and their customers.

(4) Trade Shows and Exhibitions

Trade shows and exhibitions also provide worthwhile sources for information about software development techniques, tools, and services. Their times and locations are usually published in trade magazine calendars. If a calendar is unavailable, contact vendors for the information. Although these shows provide excellent places to collect general information, the crowds may make it difficult to get a vendor's undivided attention. Also, the personnel in the booths are generally sales staff, not technical experts. As a result, they may not have answers to in-depth questions. In most cases, shows can illustrate how a technique or tool would be used and provide an opportunity to discuss services with vendors. Vendors cannot realistically simulate software development conditions in an exhibition or trade show environment.

(5) User Groups and Other Professional Associations

User groups and professional associations provide opportunities to learn how effective specific tools and techniques are. Conversations with members of these groups can frequently provide information about the quality of a vendor's software development services. For tools and techniques, users can provide information about long-term performance, multiple-user support, suitability, reliability, etc., that might be more realistic than a vendor would provide.

(6) Other Government Users

Other Government users can provide insight into the suitability of a vendor's tools and techniques for software development services for the Federal market. Government agencies are subject to special restrictions that clients in the private sector do not face, such as mandatory Federal Information Processing Standards Publications (FIPS PUBS) and Office of Management and Budget (OMB) requirements concerning financial controls and computer security. Government users would know whether particular tools and techniques result in software that complies with these requirements.
(7) Request For Information (RFI)

An RFI can also collect information about software development techniques, tools, and service capabilities and availabilities. The RFI contains a set of requirements/needs or questions that loosely describes the agency's intent and asks the vendor community to comment about the feasibility of these requirements and their approach to the problem. An agency typically uses an RFI if it is seeking an unusual or highly specialized service for which few sources may be available. Agencies announce RFIs in the Commerce Business Daily (CBD), which has a wide readership among vendors. In addition to gathering information, the RFI also notifies the vendor community that the agency is considering a software development services acquisition. This raises vendor awareness and can result in a better response to the solicitation when it is issued. One drawback to the RFI is that its issuance so early in the process provides industry little incentive to devote significant resources to respond. Unlike a Request for Comment (RFC), which usually contains draft specifications, an RFI may be far removed in time, often over a year, from the issuance of the solicitation and generally does not state specific requirements. Industry is not certain that a solicitation will follow and the RFI requirements may be completely different from those in the solicitation. As a result, vendors often respond with prepackaged marketing materials and cursory answers. While these are useful, they may not provide much insight for the agency.

b. What Information to Gather

In a market survey, the acquisition team should gather information in five areas:

o Technical feasibility

o Service availability

o Competition

o Pricing

o Industry support and contractual practices

These areas dictate whether a software development services solution is viable and the kind of business arrangements the Government can reasonably expect.

(1) Technical Feasibility

A market survey should determine the technical feasibility of the software development approach the agency anticipates. For example, the agency may anticipate developing the software in a technical environment different from the one on which it will be implemented (e.g., microcomputer development for a mainframe). For some applications (e.g., applications that interact heavily with the operating system), this approach may be infeasible or ill-advised.

(2) Service Availability

A market survey should determine the availability of the service to satisfy the agency's requirements. The initial statement of requirements may sometimes exceed capabilities currently available or advisable. For example, the number of software developers with experience in developing the type of application (e.g., payroll, property management) that the agency needs may be very limited, especially if the technologies involved are comparatively new or the expertise highly specialized or agency-specific. A survey of available services can identify relevant services that are available, as well as new technical developments that may make new techniques possible.

(3) Degree of Competition Available

A market survey should also determine the degree of competition available to satisfy the agency's requirements. The Competition in Contracting Act (CICA) requires Government agencies to obtain full and open competition. Competition results in lower costs and more innovative solutions. If an agency specifies a requirement that limits competition, it must provide justification to prove that another approach would not satisfy the requirement and that it is not in the Government's best interest to relax the requirement. The market survey may reveal that a different or broader statement or a minor change to a requirement would allow more offerors to respond and so provide more competition. This allows the Government to meet its essential needs without giving up the benefits of competition.

(4) Pricing Information

A survey should collect preliminary pricing information for both contractual and noncontractual sources (e.g., interagency agreements). The information should include hourly prices by labor category, as well as common pricing practices used by vendors (e.g., offering additional services, such as training, at no additional charge). Based on this information, the acquisition team can develop preliminary Government cost estimates and conduct comparative cost analyses.

(5) Industry Support and Contractual Practices

A survey should also produce information about common industry support and contractual practices. These may vary according to the type of technologies and services required. For example, if the agency plans to support the software after development, personnel may require training in the development tools as well as in how the software functions. If the agency plans to have the software development contractor support the completed system, less training will be required. Some contractors may include a limited amount of training in their pricing while others may charge separately.
Sources in the software development industry could provide insight into the techniques commonly used to meet an organization's needs. Industry practices may differ in:

o Pricing bases

o Levels of user or technical support available

o Technologies used

o Site of service delivery

The agency must also be aware of the differences between individual company policies and industry-wide practices. Company policies can change if the circumstances warrant. However, industry practices that evolve over time are based on collective experience and, usually, on sound business judgment. Thus, they are more difficult to change. They also provide worthwhile guidance about the way to approach these issues. To this end, agencies should consider incorporating common industry practices into the software development services requirements.

6.6 ANALYSIS OF TECHNICAL OPTIONS

The analysis of technical options should include the feasible solutions. The team should not select what they believe to be the optimal solution without undertaking a thorough analysis. Many different approaches can be used to solve the same problem. For the sake of competition and the Government's best interest, consider as many options as possible that do not place unacceptable burdens on the agency to implement, operate, or administer. For agencies considering software development services,
the technical decisions that need to be made vary by the service sought. However, the following four major areas are typically addressed:

o Performing software development services on-site or off-site

o Limiting techniques and tools used

o Limiting unilateral personnel replacement

o Deciding the range of the software development services to include

a. Performing Software Development Services On-site or Off-site

Most software development services can be performed either on the Government site (on-site) or the contractor's (off-site). If the Government has the available facilities, on-site performance may result in lower costs since a portion of the contractor's off-site rates recovers the cost of providing the facility. However, the agency must consider less obvious related costs. The facilities that the agency provides would no longer be available for use without disrupting the contractor's work. Some of the corporate resources that the contractor might apply at its own site (e.g., technical experts for periodic consultation, specialized facilities) may not be available or as effective on-site. Some agencies may have no option. For example, an agency's facility might have restrictive security requirements that preclude on-site contractor performance.

b. Limiting the Techniques and Tools Used

The agency should decide whether to limit the techniques or the tools used to perform the software development services. For example, the agency might require that the contractor use software tools compatible with the agency's. Restrictions such as this may make the resulting software easier for the agency to maintain, but could also result in limiting competition. The agency should appropriately justify any limitations it places on the techniques or tools.

c. Limiting Unilateral Personnel Replacement

The agency may decide to limit the contractor's ability to unilaterally replace members of the development team. The skill and experience of the personnel performing the work often directly affect the quality of software products. For some software development contracts, certain key positions may be so important that changing personnel increases the agency's risk to an unacceptable level. Some characteristics of developing new software (e.g., an aggressive schedule, innovative technology or technical approach, a broad scope) make it relatively high risk. The following are examples of key positions:

o Project manager

o Chief software design architect

o Technology specialist(s)

Replacing personnel in these positions might result in the loss of valuable knowledge or disrupt progress. In a negotiated contract, the agency could use key personnel clauses to prevent unilateral replacement of personnel in these important positions. Chapter 10 discusses the use of key personnel clauses. Agencies should avoid designating all personnel positions as key. Instead, focus on those for which skills, education, and experience make the most critical differences in performance. For example, the Project Manager should almost certainly be a key position. In addition, designate as key positions those requiring knowledge and experience that might be in short supply in the marketplace (e.g., experience with a particular proprietary CASE tool). Do not designate as a key position one that will have only minor involvement with the contract (e.g., contract supervisor, usually a corporate executive). In addition, junior personnel should not generally be designated as key since their level of skill and experience is typically in greater supply and, thus, more readily interchangeable.

d. Deciding the Range of Software Development Services

The agency must also decide the range of the services to include in the acquisition. Although the acquisition team reached a preliminary decision in the requirements analysis, the agency should revisit this decision during the analysis of technical options to confirm that the services originally identified are still mandatory. Should the agency scale back the solution because of cost or technical considerations? Should it expand the range of services to include additional ones (e.g., extended support, additional training) or start earlier in the life cycle (e.g., develop the preliminary design as well as the detailed design)? To reap the benefits of a software development services acquisition and avoid common pitfalls, the agency should consider three factors concerning the range of services:

o Define
the services included to maximize contractor effectiveness -- A significant benefit of a software development services acquisition is to take advantage of the contractor's expertise. If the agency defines the range of services too narrowly (e.g., agency personnel still perform key elements of the process), this benefit may be lost. However, if the range of services is too broad or ill-defined, assessing contractor performance and maintaining contract control will be difficult.

o Retain sufficient Government review points to avoid creating a possible conflict of interest -- During the software development process, the results of initial steps determine the level of effort for subsequent steps. The agency should include review points to validate these results before the contractor proceeds. This avoids a potential conflict of interest on the contractor's part. Do not make the contractor solely responsible for both determining agency needs and meeting those needs without agency validation.

o Maintain sufficient Government technical expertise and oversight to monitor contractor performance -- If the contractor performs the software development services without agency personnel oversight, the agency may view its involvement as primarily administrative and overlook the need for IRM/technical management of the contract. Without IRM/technical personnel actively involved in monitoring contractor performance, the agency may miss early signs of contractor problems (e.g., lack of personnel expertise), allowing them to become worse. The agency should provide adequate in-house technical expertise and maintain the active involvement of its IRM/technical personnel with the contractor to identify potential problems early and encourage prompt resolution. To ensure that the contractor provides technical support and not personal services, the roles of both agency and contractor personnel should be distinct and well-documented. The agency should also specify that the contractor provide deliverables at given intervals to document its role.

e. Choosing Among Technical Options

To choose among available technical options, determine which of them merit further consideration and what characteristics among the remaining ones distinguish them from one another. The agency's goal is to acquire high quality software development services at a reasonable price, but defining and ensuring quality for services is more elusive than for tangible FIP resources, such as equipment. The agency should focus on quality predictors (i.e., factors usually associated with high quality, such as well-qualified personnel) for software development services and how effectively it can control or monitor quality under each option.

(1) Identifying the Options That Meet All Mandatory Requirements

Based on the results of the market survey and the technical analyses discussed above, reduce the initial set of technical options to those that meet the agency's mandatory requirements. If the first cut results in eliminating too many options (e.g., all but one), the team should revisit the requirements to see if some can be changed.

(2) Identifying the Key Discriminators

Identify the key discriminators. Although they will vary according to the agency's objectives, these are the characteristics that would cause the agency to choose one option over another. They must be real discriminators, not just important requirements, because their function is to separate the options from each other. If they are important, but all the options meet them equally well, then they are not discriminators. The following are examples of possible key discriminators:

o Support for program and user requirements
o Availability
o Effect on competition

The agency should identify key discriminators based on the objectives of the acquisition. The options that perform best against these discriminators will be the preferred ones.

(3) Comparisons

After identifying the key discriminators, compare the options against each discriminator in terms of cost, benefit, and risk. The first two, cost and benefit, can be quantifiable monetary items, quantifiable nonmonetary items, and nonquantifiable items. Figure 6-5 provides examples of each type. They should focus on the ability of the options to provide additional value beyond the mandatory requirements since all options must meet the minimum requirements.

A. Costs

Costs must be full system life costs related to the option for that discriminating factor. Overall cost for the option is usually considered as a separate discriminator. Include both non-recurring (e.g., site preparation for on-site services) and recurring costs (e.g., services and supplies). Also look for hidden ones, such as the cost of agency contractor oversight.

B. Benefits

Benefits include both program and IRM benefits. Program benefits are improvements that the services will produce in an agency's effectiveness and ability to complete its mission. IRM benefits are improvements in service delivery, such as increased availability or better service administration.

Descriptions of benefits should be related to the agency's mission, goals, objectives, functions, and operating environment. Quantify benefits whenever possible. Quantifiable benefits generally carry more weight and strengthen the agency's position.

[Figure 6-5: Examples of Costs and Benefits by Type]

C. Risk

When considering risks, focus on those associated with the option and the specific discriminator. Like cost, overall risk is usually considered as a separate discriminator. The risk analysis should consider the likelihood that benefits will not materialize or that costs will exceed those expected.
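To make the comparison step concrete, the sketch below scores hypothetical options against the example discriminators using a simple weighted-sum approach. This is only an illustration of one way to organize the comparison; the Guide does not prescribe a particular scoring method, and the option names, weights, and scores shown are invented.

    # Illustrative only: a simple weighted-sum comparison of technical options
    # against key discriminators. Weights and scores are hypothetical.

    DISCRIMINATOR_WEIGHTS = {
        "support_for_requirements": 0.5,
        "availability": 0.3,
        "effect_on_competition": 0.2,
    }

    # Scores from 1 (worst) to 5 (best) for each option against each discriminator.
    OPTION_SCORES = {
        "Option A": {"support_for_requirements": 4, "availability": 3, "effect_on_competition": 5},
        "Option B": {"support_for_requirements": 5, "availability": 2, "effect_on_competition": 3},
    }

    def weighted_score(scores: dict) -> float:
        """Combine discriminator scores into a single weighted figure of merit."""
        return sum(DISCRIMINATOR_WEIGHTS[d] * s for d, s in scores.items())

    for option, scores in sorted(OPTION_SCORES.items(),
                                 key=lambda item: weighted_score(item[1]),
                                 reverse=True):
        print(f"{option}: {weighted_score(scores):.2f}")   # Option A: 3.90, Option B: 3.70

Cost, benefit, and risk findings that cannot be meaningfully quantified should still be recorded in narrative form alongside any such scores rather than forced into a single number.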

f. Documenting the Technical Choice

Based on its analysis of the alternatives, the agency may choose one or a variety of acceptable technical options, leaving the specifics to offerors. After the team chooses the option(s), it must document its choice. The technical options portion of the analysis of alternatives document describes the options considered, the methodology used, the assumptions and constraints applied, the rationale for selecting the key discriminators, an analysis of the options against the discriminators, the recommended option(s), and a sensitivity analysis for the key assumptions. At the same time the team documents the analysis of technical options, it should also document refinements to the agency's requirements in an addendum to the requirements analysis. Since the subsequent acquisition steps will be based on these refined requirements, they should be documented and kept with the original requirements. This will help defend the option selection process if that becomes necessary.

6.7 ANALYSIS OF ACQUISITION OPTIONS

In addition to the technical options, the acquisition team must consider acquisition options, including both noncontractual and contractual approaches to acquiring the software development services.

a. Noncontractual Options

To make the most effective use of existing Government resources, the agency should consider noncontractual options first. They include the following:

(1) Do Nothing

The acquisition team should be prepared to do nothing further if the analysis shows the costs of providing the software development services exceed the benefits. Under this option, the agency might determine it is more advantageous to forego the services or continue current services, if they exist.

(2) Reassign Software Development Services Resources within the Agency

Parts of the agency with in-house software development personnel could reassign them to support another office's requirements. Often, however, the requiring office has needs too extensive for this kind of arrangement.

(3) Reschedule Software Development Services

Existing software development services might be shared with other offices by rescheduling activities to accommodate each group within the agency and also spread the cost among them. This would be feasible only if the software development services required a predictable amount of time to complete and could be completed within an acceptable timeframe for each group.

(4) Share Software Development Services with Another Agency

An agency with its own software development services program may be willing to provide limited services if it can accommodate the additional workload and if technical environments are similar. If an existing contract can provide the shared software development services, the scope of the contract must allow for use by other agencies.

In practice, software development services are difficult to acquire through noncontractual options because these options assume a regular service schedule that can be rearranged to accommodate the agency's requirements. Agencies generally acquire software development services as needed for a specific system, not continuously. Important factors to consider for noncontractual options are:

o Long-term viability -- Often noncontractual options can provide short-term relief, but not the resources required for the long term. One agency is not obligated to continue to provide software development services to meet another agency's requirements. Therefore, an agency currently providing these services may not continue doing so. If long-term viability is a concern, consider whether the resources spent on pursuing the noncontractual option might be better spent pursuing a long-term contractual solution.

o Adequate range of services available -- Before sharing software development services or acquiring them from noncontractual sources, determine that the option provides all required services and resources. Services acquired through noncontractual sources are generally on an "as is" basis.

If an agency requires special services, it must be prepared to arrange for these separately.

o Control and priority over resources -- Typically, when an agency agrees to share its resources, it retains control of them and the right to establish priorities for their use. If the software development service is critical to the borrowing agency, its lower priority may prove unacceptable.

b. Contractual Options

If the results of the analysis eliminate noncontractual options, the following four contractual options are available:

o Small purchase procedures
o Established sources of supply [e.g., Federal Information Systems Support Program (FISSP)]
o Sealed bidding
o Contracting by negotiation

Most software development services acquired by contract are acquired either by a negotiated acquisition or established sources of supply. Requirements usually cannot be defined precisely enough for the sealed bidding method. Furthermore, the size and scope of software development services acquisitions normally preclude the use of small purchases.

c. Choosing Among the Acquisition Options

After the feasible acquisition options have been identified and examined, choose the option most advantageous to the Government. To choose the best contractual option, follow the guidance provided in the Federal Acquisition Regulation (FAR) and the FIRMR.

d. Document the Analysis of Acquisition Options

The team should document and explain the acquisition option chosen, including:

o Noncontractual options considered
o Contractual options considered
o Rationale for selected option(s)

6.8 DOCUMENTING THE ACQUISITION STRATEGY

Document the acquisition strategy carefully and completely. It provides the rationale for all major decisions about the acquisition. A well-documented strategy helps maintain internal momentum if the composition of the acquisition team changes. The agency can also use the documentation to defend its decisions to oversight committees and agencies and against protests. FAR Part 7 describes requirements for agency acquisition plans. FIRMR Bulletin C-5 describes the documentation necessary to support an Agency Procurement Request (APR) submission to GSA to secure a Delegation of Procurement Authority (DPA), if required. Documentation to support the acquisition may include:

o Requirements Analysis
o Analysis of Alternatives (including a comparative cost analysis)
o Justification for Limiting Competition (if appropriate)

During oversight agency reviews [e.g., a General Accounting Office (GAO) audit, an OMB review, or GSA consideration of an APR], any or all of these documents may be requested.

CHAPTER 7: COMPETITION REQUIREMENTS

One of the basic objectives when acquiring Federal information processing (FIP) resources through contracting is to ensure that the Government realizes all the benefits of full and open competition. The term full and open competition means that all responsible sources may compete.

7.1 CONTRACTING UNDER FULL AND OPEN COMPETITION

Full and open competition is not just the law; it makes good sense. The competitive market serves the best interests of Government agencies, the taxpayers, and the vendor community. The Competition in Contracting Act (CICA) establishes requirements for full and open competition. The Federal Acquisition Regulation (FAR) and the Federal Information Resources Management Regulation (FIRMR) implement these requirements. Agencies are in violation of CICA, FAR Part 6, and the FIRMR if they fail to provide for full and open competition, to provide for full and open competition after exclusion of sources, or to justify the use of other than full and open competition.

Unless one or more of the specific conditions exist that allow other than full and open competition, the agency must provide for full and open competition. Program, Information Resources Management (IRM)/technical, and contracting staff all have responsibility for ensuring that every potential qualified source of software development services that can meet the requirements is sought, considered, and encouraged to participate. The acquisition team needs to examine carefully all documents containing requirements to determine whether they restrict competition and, if so, remove unnecessary restrictions.

7.2 DEGREES OF COMPETITION

The FAR defines three degrees of competition.

a. Full and Open Competition

Full and open competition should result from using a competitive procedure or a combination of competitive procedures best suited to the circumstances of the individual acquisition. In this process, the Contracting Officer (CO) is responsible for using good judgment in selecting the competitive procedure that "best meets the needs of the Government" (FAR Part 6).

b. Full and Open Competition After Exclusion of Sources

Under certain circumstances, an agency may exclude a source or sources from participating in an acquisition. These circumstances fall under the following categories:

o Establishing or maintaining alternative sources
o Set-asides for small business and labor surplus area concerns
o Small Business Act, Section 8(a) competition [Sole source 8(a) awards are considered to be awarded under other than full and open competition]

c. Other Than Full and Open Competition

Under limited conditions, contracting for software development services without providing for full and open competition may be allowed. FAR Part 6 and the FIRMR contain the policies and procedures and identify the statutory authorities for this type of acquisition.

(1) Circumstances Permitting Other Than Full and Open Competition

Circumstances that permit awarding a contract without full and open competition are:

o The required supplies or services are available from only one responsible source, and no other supplies or services will satisfy the agency's requirements.
o Unusual and compelling urgency exists and the Government would be seriously injured if the number of sources were not limited.
o The agency needs--
  oo To maintain a supplier available for furnishing supplies or services in case of a national emergency or to achieve industrial mobilization
  oo To establish or maintain an essential engineering, research, or development capability provided by an educational or other nonprofit institution or a Federally funded research and development center
o An international agreement, treaty, or written direction exists under which the U.S. Government will be reimbursed for the purchase of supplies and services.
o A statute expressly authorizes or requires the acquisition be made through another agency or from a specified source, or the agency needs a brand name commercial item for authorized resale.
o Disclosing the Government's needs would compromise the national security.
o An agency head determines that it is in the public interest to limit competition. For this, a written notice must be submitted to Congress.

(2) Justification for Other Than Full and Open Competition

Justification for other than full and open competition must cite the specific requirements defined in the requirements and alternatives analyses. The CO must prepare the justification and obtain the necessary agency approvals. In accordance with FAR Part 6 and the FIRMR, program and IRM/technical personnel must certify the completeness and accuracy of the justification. The justification for other than full and open competition must contain the rationale and sufficient facts to warrant the use of the specific FAR or FIRMR authority cited and must be approved in writing. FAR Part 6 establishes levels of approval (i.e., CO, agency competition advocate, agency senior acquisition executive) based upon the proposed contract dollar value and other factors.

7.3 COMPETITION ADVOCATES

FAR Part 6 establishes competition advocates for each agency and for each acquisition activity of the agency, as required by CICA. They are responsible for questioning barriers to full and open competition. The acquisition team must notify the competition advocate of a pending acquisition in accordance with the agency's policies and procedures. Competition advocates report the following to the agency's senior acquisition executive:

o Opportunities and actions taken to achieve full and open competition
o Conditions or actions that unnecessarily restrict competition
o Annual reports on:
  oo Competition advocate activities
  oo New initiatives required to increase competition
  oo Any barriers to full and open competition
  oo Recommendations of goals and plans to foster competition
  oo Recommendations to increase organizational responsibility for competition

CHAPTER 8: REVIEW, APPROVAL, AND AUTHORIZATION PROCESS

Some acquisitions for software development services require an agency-level review and approval and, possibly, GSA authorization. The need for GSA authorization depends on the acquisition's dollar value and the agency's regulatory or specific agency delegation of procurement authority (DPA). However, many software development services acquisitions are within an agency's authority and will not require a specific GSA authorization.

[Figure: Typical Agency-Level Reviews]

8.1 THRESHOLD LEVELS

The Federal Information Resources Management Regulation (FIRMR) delegates a prescribed level of authority to all agencies for acquiring Federal information processing (FIP) resources. This authority is referred to as a regulatory DPA. Agencies may also receive specific agency delegations from GSA for higher or lower dollar limits. Both types of delegations allow agencies to acquire software development services up to a specific dollar limit without GSA's prior approval. Above that limit, agencies must obtain a DPA from GSA for a specific acquisition. At the time of this printing, an agency can acquire software development services under a regulatory delegation when the dollar value of the services over the life of the contract does not exceed $2,500,000 ($250,000 for services available from only one responsible source). Check the FIRMR for current regulatory delegations.

8.2 AGENCY-LEVEL REVIEW AND APPROVAL

Each agency has a Designated Senior Official (DSO) who is responsible for carrying out the agency's Information Resources Management (IRM) functions. This official is assigned responsibility for the conduct of and accountability for acquisitions of FIP resources. The DSO is responsible for reviewing internal documentation and submitting agency procurement requests (APRs) to GSA when required. For many software development services acquisitions, the DSO delegates these responsibilities. Check with the GSA Acquisition Reviews Division (KMA) for identification of the agency's DSO and/or designees. Acquisition review and approval policies and procedures for software development services vary from agency to agency. In addition, the procedures and levels of review and approval at a specific agency may vary according to the size and scope of the software development services acquisition. An agency review and approval process normally occurs during the presolicitation, evaluation, and pre-award phases of the acquisition. Figure 8-1 summarizes, for a typical agency, the types of documentation, determinations, and guidance needed and the personnel usually responsible for their preparation and review during these phases. Even though the specific policies and procedures for conducting internal reviews and approvals may vary by agency, all acquisitions must comply with Federal regulations. This compliance dictates the types of analyses conducted and the documentation reviewed and approved by the agencies (and GSA, if applicable). A software development services acquisition will typically include the review and approval of the following documents:

o Requirements Analysis -- Process for determining and documenting the agency's need for the software development services.
o Analysis of Alternatives -- Process of investigating and considering alternatives for satisfying software development services requirements. The purpose of the analysis is to determine which technical and acquisition options will best meet the Government's needs.
o Conversion Study -- A study to determine the impact on resources (i.e., cost and time) of relocating software from one hardware/operating system environment to another.
o Performance Evaluation for the Currently Installed FIP System -- Provides a baseline for evaluating proposed alternatives to meet data processing needs.

When applicable, the following documentation may also be required:

o Certified Data to Support a Requirement Available from Only One Responsible Source -- Provides justification for other than full and open competition.
o Determination to Support Compatibility-Limited Requirements -- Provides justification for compatibility limitations for developmental or custom software, such as the need to use a specific operating system.
o Description of Planned Actions to Foster Competition for Subsequent Acquisitions -- Describes actions to be taken to foster full and open competition.

8.3 GSA REVIEW AND AUTHORIZATION

If the estimated cost of the software development services exceeds the agency's regulatory or specific agency DPA, the agency must request an acquisition-specific DPA. To obtain this, the DSO or designee must submit an APR to GSA. The FIRMR gives specific APR guidance (see FIRMR Bulletin C-5, Delegation of GSA's Exclusive Procurement and Multiyear Contracting Authority). The DPA issued will be limited to the resources described in the APR. GSA reviews the APR for compliance with regulations and policies that address overly restrictive specifications, evaluation factors, sole source justifications, splitting requirements, excessive contract length, and other areas of concern. GSA has 20 working days (plus 5 days for mail) to process the APR. Agencies can assume the DPA has been granted after the 25-day period if GSA has not requested additional information. If GSA requests additional information, the 20-day clock starts again when GSA receives it. After review of the APR and any additional documentation submitted, GSA may--

o Delegate to the agency the authority to conduct the contracting action,
o Delegate to the agency the authority to conduct the contracting action with GSA participation to the extent GSA considers necessary under the circumstances,
o Conduct the contracting action itself or otherwise obtain the requirement on behalf of the agency, or
o Restrict authority until all conditions identified by GSA have been met.

Check the FIRMR and its procedural bulletins for policies and guidance. Submit questions about preparing an APR to the GSA KMA.

CHAPTER 9: GENERAL CONTRACTING CONSIDERATIONS

The success of acquiring software development services by contracting depends on five key areas: (1) sound acquisition planning and execution; (2) adherence to applicable Government regulations, policies, and guidelines; (3) proper assignment and understanding of roles and responsibilities by all individuals involved in the acquisition process; (4) identification of appropriate requirements for the development contractor; and (5) selection of the proper contract type.

9.1 SOUND ACQUISITION PLANNING AND EXECUTION

Successful selection of a contractor that best meets the needs of the agency depends upon developing and executing a sound acquisition plan. If the plan is faulty, the best offeror(s) may not be selected, the availability or effectiveness of the services may be reduced, or some other circumstance may result that is not in the Government's best interest. Agencies should adhere to the general acquisition planning and execution considerations listed below regardless of the type of acquisition.

a. Match the Level of Effort to the Size and Scope of the Acquisition

Base the level of planning, resources, and detail of the various analyses on the size, scope, and complexity of the acquisition.

In developing an acquisition plan, relate the level of effort to the estimated cost and complexity of the software development services to be acquired. Unfortunately, no hard and fast rules exist for determining the level of effort required. Each acquisition has its own profile. However, the level of effort and resources required for an acquisition will always increase as its complexity increases.

b. Maximize Competition

The Government requires full and open competition whenever possible. Through this process, the Government obtains competitive prices for the services it requires at terms and conditions that are in its best interest. Competition also allows consideration of more innovative and effective solutions. Agencies limiting competition may face problems such as:

o An increased potential for protest by potential offerors who cannot compete due to unnecessarily restrictive requirements
o GSA disapproval of the acquisition based on unnecessary constraints on competition
o Office of Management and Budget (OMB) or Congressional intervention
o Acquisition of services that are not the most advantageous to the Government

The acquisition team can maximize competition and reduce the potential for these difficulties by ensuring that the translation of requirements into solicitation specifications does not unnecessarily reduce the potential number of qualified offerors. Overly detailed specifications, derived from the requirements analysis and analysis of alternatives, can potentially limit competition. If an offeror cannot meet every specification the agency establishes, the offer can be eliminated from further consideration. For example, if the agency specifies that a software development contractor use a specific tool for developing graphical user interface (GUI) screens, offerors proposing other GUI development tools would be eliminated from the competition as noncompliant, even if the other tools could meet the agency's needs equally well. The agency should also develop fair proposal evaluation factors. Such factors not only make it easier to select the offer that is most advantageous to the Government, but also provide a fair and equal opportunity for competition. Some common pitfalls to avoid include:

o Establishing, without justification, factors that can be met only by a limited group of offerors (e.g., experience with a proprietary software development methodology)
o Ranking factors in favor of one or two potential offerors
o Borrowing factors from another acquisition because that acquisition was successful rather than because the factors reflect the agency's needs

c. Develop Clear Specifications and Work Statements

By stating its needs clearly in the solicitation, the agency:

o Maximizes its opportunity of getting the software with the capabilities and features it needs
o Makes it easier for offerors to prepare their proposals
o Shortens the time needed to evaluate proposals
o Eases the administration of the resulting software development services contract

To develop clear specifications and work statements:

o Simply and accurately state what the contractor must provide -- This reduces the potential for misinterpreting the requirements and, as a result, the need for offerors to clarify their proposals.
o Specify requirements precisely -- Avoid subjective terms that cannot be accurately measured, defined, or identified by either the Government or the offeror. Examples include "where feasible," "an unacceptable level of performance," "an appropriate response time."
o Distinguish between Government and contractor responsibilities -- For example, always use "shall" when referring to an activity the contractor must do and "will" when referring to something the Government will provide or do. Establishing this convention removes any doubt about what the Government requires. It alerts the offeror to address all statements containing the word "shall" or "must" to be considered responsive to the solicitation. The offeror will also know the Government's obligation or right to do whatever is in a statement containing "will."
o Use the terms "and" and "or" properly -- "And" is inclusive; it means "plus" (i.e., a, b, and c means all three). "Or" permits an alternative, a substitute (i.e., a, b, or c means any of the three).
o Avoid the use of industry or advertising catch words to describe features or characteristics -- Industry jargon, such as "fully integrated" or "user friendly," tends to describe features too broadly for use in the solicitation. To avoid misinterpretation and possible disputes, define precisely the terms that describe the requirements. For example, "user friendly" should be replaced with the specific characteristics, such as "a consistent command set" or "a context-sensitive help function," that would make the software user friendly.
o State each requirement only once -- This eliminates the possibility of contradictory requirements in the solicitation.

For example, establishing an acceptance test period of 12 weeks in one section of the solicitation and 15 weeks in another is contradictory. To eliminate this, state the requirement in one place and cross-reference it in the rest of the solicitation.

o Define all terminology -- To prevent confusion, define all terms that might be interpreted in more than one way (e.g., business day, holiday). Use words and terms in accordance with their general definitions. Avoid coining new terms or defining established terms in a unique way. Consider developing a comprehensive glossary with exact definitions. It reduces the potential for misinterpreting requirements.

d. Avoid Providing Information to Potential Offerors Unless Authorized

By law, all potential offerors must have equal access to available Government information. Giving information to one or all potential offerors without authorization from the Contracting Officer (CO) can result in cancellation of the acquisition. It can also result in the suspension or dismissal of those responsible. In some circumstances, it may be a criminal offense or result in severe civil penalties. Any information that might be sensitive should be discussed with and approved by the CO before it is given to potential offerors. If approved for release, the decision should be documented in the contract file. Interacting and exchanging information with potential offerors may be in the Government's best interest at various stages in the acquisition process (e.g., during market surveys). However, even at this stage, every potential offeror should receive the same information. When the survey stage ends and development of specifications or work statements begins, contact with potential offerors should stop for everyone except the CO or the CO's designated representatives. Protecting sensitive information from inadvertent release to a potential offeror is especially difficult when a current contractor is a potential offeror. In this instance, Government personnel need to take extra care to avoid inadvertently giving information to the incumbent contractor that is not provided to all potential offerors. Measures to consider include:

o Performing all acquisition-related work in an area physically separated and secure from the contractor's personnel
o Ensuring all releasable information available to the contractor is also available to every potential offeror
o Limiting the number of Government personnel involved in the acquisition process who must interface with the contractor's personnel

e. Document and Follow the Source Selection Process

The Source Selection Plan (SSP) guides the process for negotiated acquisitions. It details the rules and procedures the agency will use to select the winning contractor. It serves as the charter for and overall direction to the agency personnel who will evaluate proposals. A clear, comprehensive, well-documented SSP is essential to selecting the offer that is most advantageous to the Government. It will help the acquisition team avoid many common problems encountered in the source selection process (e.g., protests, problems in evaluating bids and proposals). The SSP includes descriptions of the following:

o Roles, responsibilities, and organization of the Source Selection Team [the contracting, Information Resources Management (IRM)/technical, user personnel, and advisors involved in the source selection process]
o Proposed presolicitation activities. These might include issuing a draft solicitation or convening a conference either before or after issuance of the solicitation and before the due date for proposals.
o A summary of the acquisition strategy
o A statement of proposed evaluation factors, significant subfactors, and their relative importance
o A description of the evaluation process, methodology, and techniques
o A schedule of significant milestones

Adhering to the solicitation and the SSP is critical to selecting the most advantageous offer. This not only increases the likelihood of a fair and equal evaluation of all offers received, it also protects the agency from potential charges of bias during the evaluation process. If the plan is not followed, an offeror may claim that a deviation was detrimental to its bid or proposal and subsequently file a protest.

f. Involve Each Discipline throughout the Acquisition Process

Agencies should use the experience and expertise of program, IRM/technical, and contracting personnel throughout the acquisition process. Each group has the ability to make a unique contribution to a successful acquisition. In addition, the interactions among diverse disciplines should result in the selection of higher quality software development services.

Without this interaction, the services acquired may produce software that does not fully meet the agency's documented requirements or conform to the contract. The result may not be in the Government's best interest. Involving all three disciplines throughout the process usually assures that the acquisition team members will have greater motivation to assume responsibility for acquiring and implementing the software development services.

g. Actively Seek Parallel Reviews and Approvals

Perform as many of the required agency reviews and approvals in parallel as possible. Parallel review is a technique that brings together the principals and materials needed to make the source selection decisions. Identified in the GSA Go-For-12 program, which is designed to shorten the acquisition process and improve its effectiveness, parallel review can shorten and enhance the acquisition process. Information on this program is available from GSA's Agency Liaison Officer Program Division (KAL).

h. Document the Bases for Decisions

The acquisition team should document the basis for every decision both for historical purposes and to support review by agency officials or oversight organizations. Many decisions about the development of the solicitation and the conduct of the acquisition are made throughout the source selection process. Changes in circumstances, such as changes in the technical environment or the available funding, may cause the team to revisit its decisions. The acquisition team should document its decisions at a level of detail appropriate to the size, scope, and mission criticality of the required software. For example, an acquisition for software development services for a training database at a small regional office should not require as much documentation as an acquisition for software development services for a large, agencywide system supporting the agency's primary mission. In the latter case, documentation of the decisions and rationale for the evaluation results should include substantially more detail.

i. Assign a Team with a Level of Expertise Equal to the Size and Scope of the Acquisition

The acquisition team must have the necessary expertise in each discipline (i.e., program, IRM/technical, and contracting) to conduct the acquisition successfully. Therefore, the team's level of expertise must match the size and scope of the acquisition. When the software to be developed can significantly affect an agency's ability to perform its mission effectively, the agency should assign senior and experienced personnel from each discipline to the acquisition team. Conversely, acquiring software development services for a noncritical system may not require that same level of expertise. Individuals in the agency with the necessary skills and expertise to support the acquisition should participate. When they are not available, consider obtaining the required expertise from other Government or industry sources.

9.2 APPLICABLE REGULATIONS AND GUIDANCE

Agencies should follow applicable policies, procedures, and guidelines throughout the acquisition process no matter which contracting method is used. These include the Federal Acquisition Regulation (FAR), the Federal Information Resources Management Regulation (FIRMR), Office of Federal Procurement Policy (OFPP) letters, and GSA guidelines. Internal agency policies and procedures must also be followed. These might address:

o Acquisition policies and procedures
o Agency-specific standards required in the solicitation, such as document standards, agency standard software development methodologies, or tools
o Approval levels

Members of the acquisition team must become familiar with both Governmentwide and agency-specific policies and procedures and adhere to them as applicable. In addition, the General Services Board of Contract Appeals (GSBCA) and the General Accounting Office (GAO) rule on protests made by offerors. These protest decisions are based on interpretation of law, Government policy, and regulations. The acquisition team (especially the CO) should review these rulings to ensure that the solicitation, SSP, and source selection process are consistent with the rulings.

One source of information about recent GSBCA rulings is the ADP Protest Report, published quarterly by the GSA Information Resources Management Service (IRMS). For further information, contact GSA's Acquisition Reviews Division (KMA). The agency's General Counsel can also provide information about protest rulings.

9.3 ROLES AND RESPONSIBILITIES

Each discipline on the acquisition team has primary and complementary roles and responsibilities that contribute to the successful acquisition and implementation of software development services.

a. Program Personnel

Program personnel ensure that the acquisition meets the mission and functional needs of the organization. In performing this role, program personnel:

o Participate in the development of the specifications and work statements included in the solicitation to make certain they meet program requirements
o Participate in the development of the source selection methodology
o Participate in the development of the evaluation factors and subfactors used to select the contractor
o Participate in the source selection process that results in identifying the recommended contractor
o Support the CO in negotiations with the offerors (as requested)

b. IRM/Technical Personnel

In acquisitions for software development services, IRM/technical personnel ensure that the software meets all technical requirements. In this role, IRM/technical personnel participate in many of the same activities as the program personnel. They participate in developing the solicitation specifications, work statements, and the evaluation factors and subfactors, as well as in evaluating bids and proposals. However, they provide Federal information processing (FIP) technical expertise as opposed to the functional expertise provided by the program personnel.

c. Contracting Personnel

Contracting personnel ensure that the acquisition meets the policies and procedures of the FAR, FIRMR, and agency. The CO acts as the official interface with offerors for all acquisitions. The CO is the only individual with the authority to negotiate on behalf of and commit the Government to contract awards or modifications. The CO:

o Ensures that all necessary agency and GSA approvals are obtained
o Ensures that the source selection process is conducted in compliance with Federal and agency regulations and procedures
o Ensures that the source selection process is conducted as specified in the solicitation
o Awards the contract following selection of the contractor

9.4 IDENTIFYING REQUIREMENTS FOR THE SOFTWARE DEVELOPMENT CONTRACTOR

In addition to requirements for the software to be developed (the subject of Chapter 5, Requirements Analysis), the agency must include in the solicitation requirements for the development services the contractor must provide. Each of these areas is discussed below in general terms. A Guide for Evaluating Proposals and Bids, another guide in this series, also discusses the distinction between requirements for specific FIP resources and requirements for the contractor that will provide them.

a. Specific Development Services

The agency should identify the software development services the contractor should provide, considering the entire development life cycle. In addition, if the agency decides to acquire related services, such as training, conversion, software maintenance, and user support as part of the acquisition, it must also specify requirements for them in the solicitation. Chapter 5 describes training and conversion requirements. Examples of software maintenance requirements include optimizing the database each month and database table maintenance.

Examples of user support requirements include the hours a hotline is available and minimum turnaround time, by category of urgency, for problems reported to the hotline.

b. Government and Contractor Responsibilities

A software development services acquisition requires the joint effort of the agency and the contractor. In addition to specifying what the contractor must provide, the agency must specify what the agency will provide in areas such as:

o Facilities -- This includes space for contractor personnel to work and office supplies, such as telephones and photocopying equipment. The contractor's personnel might visit the agency's facility infrequently and require only access to a telephone, or they might be located on-site full-time for the duration of the contract. Software development services may be performed on-site when the work requires using the agency's computing facilities and a high level of coordination with agency personnel. Contractors typically offer lower prices for on-site arrangements, although the Government must consider its own cost for providing space and supplies when calculating the total cost over the life of the contract.

o Computer Equipment and Software for the Development -- The computer equipment and software required for a software development services contract will vary by contract, but could include:
  oo A mainframe or minicomputer
  oo Disk space sufficient for establishing a separate development and testing environment
  oo Terminals, microcomputers, or workstations for developers
  oo Telecommunications [e.g., Local Area Network (LAN), terminal to mainframe]
  oo Development tools [e.g., programming languages, Computer-Aided Software Engineering (CASE) tools]

For off-site work, the Government may require the contractor to provide the equipment and software required for the new development. Typically, the contractor would charge the Government for their use. In this case, the equipment that the contractor provides remains in its possession. In addition, the Government might provide equipment and software the contractor lacks, particularly special purpose or unique items not available commercially. Alternatively, the Government could require the contractor to acquire the equipment and software needed for development and, typically, the contractor would bill the Government for them. In this case, the Government would normally take title to the equipment and "own" it. In a software development services contract, the agency should require the contractor to acquire only the equipment and software needed for development. Requiring the contractor to acquire the equipment needed to operate the software would turn the development contract into a systems integration services contract, another type of FIP resources contract (see A Guide for Acquiring Systems Integration Services, another guide in this series). A third alternative is for the agency to provide the equipment and software for contractor use. In deciding whether to supply equipment and software required for development, the agency should weigh the cost, resource, and schedule implications of acquiring the equipment itself vs. having the contractor acquire it. The agency should keep careful inventory records of equipment provided to a contractor and ensure its return at the close of the contract. Equipment that the contractor acquires at the Government's direction and is paid for through the contract should also be turned over to the agency at the end of the contract.

o Information -- Agencies will almost always provide information to offerors to help them in developing their proposals and in conducting the work after contract award.
Information provided before award is typically either included as an attachment in Section J of the solicitation or in a solicitation library offerors can visit. The agency should identify the information that vendors might need, assemble materials already written, and develop any necessary new materials (e.g., lists of source documents). Information provided would typically include the documentation the agency submitted with its Agency Procurement Request (APR). At minimum, this would include the Requirements Analysis and Analysis of Alternatives but might also include organization charts, mission statements, agency software development standards, and descriptions of existing systems.

o Staffing -- The agency should decide whether it wants contractor personnel to perform all software development services or whether agency personnel will perform some of them. For example, the agency may decide that it wants its own personnel to train agency users of the software in regional offices. Using both agency and contractor personnel to support the same function makes it more difficult to hold the contractor accountable for quality and performance and provides opportunities for denial of responsibility (i.e., finger pointing). Dividing software and database design activities between the agency and the contractor is one example of this situation.

Overlapping responsibilities could lead to claims from the contractor that, in producing the design, the agency's personnel added new requirements and, thus, changed the scope of the contract. Also, changes made by one group may not be communicated to the other, possibly leading to design flaws that become more difficult to correct as the project proceeds. Finally, if the agency's employees fall behind schedule in producing their part of the design, the contractor may claim that it is entitled to reimbursement of additional costs caused by the delay. To avoid such problems, clear divisions of responsibility between the Government and the contractor are essential.

c. Estimated Labor Hours by Personnel Category

For requirements contracts, the agency must include the estimated total quantity of services in the solicitation and resulting contract. For indefinite quantity contracts, the agency must include minimum and maximum quantities. For software development services, these are usually stated as labor hours by personnel category. For other types of contracts, the Government typically requires the contractor to estimate the level of effort required and does not state its own estimate in the solicitation. However, agencies should estimate the number of hours required in order to evaluate offerors' technical approaches and cost reasonableness and realism.
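As a simple illustration of how such an internal estimate might be organized, the sketch below totals hypothetical hours by labor category and applies hypothetical fully loaded rates to produce a rough Government estimate for judging cost realism. The categories, hours, and rates are invented for illustration and are not drawn from this Guide.

    # Hypothetical internal estimate of labor hours and cost by personnel
    # category, used only to illustrate the kind of figures an agency might
    # prepare to evaluate offerors' cost realism. All numbers are invented.

    ESTIMATED_HOURS = {          # labor hours by personnel category
        "Project Manager": 1_800,
        "Senior Software Engineer": 7_200,
        "Programmer": 14_400,
        "Technical Writer": 1_200,
    }

    LOADED_HOURLY_RATES = {      # hypothetical fully loaded rates (dollars/hour)
        "Project Manager": 95.0,
        "Senior Software Engineer": 80.0,
        "Programmer": 60.0,
        "Technical Writer": 45.0,
    }

    total_hours = sum(ESTIMATED_HOURS.values())
    total_cost = sum(ESTIMATED_HOURS[c] * LOADED_HOURLY_RATES[c] for c in ESTIMATED_HOURS)

    print(f"Total estimated hours: {total_hours:,}")   # 24,600
    print(f"Estimated cost: ${total_cost:,.2f}")       # $1,665,000.00

An estimate of this kind supports the evaluation of offerors' proposed levels of effort; for indefinite quantity contracts, the stated minimum and maximum quantities would be drawn from figures like these.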

d. Minimum Personnel and Corporate Qualifications

The agency should determine the minimum acceptable qualifications for offerors and their personnel to perform the required software development services. For personnel, the agency should determine qualifications in areas such as:

o Minimum educational level (e.g., a Bachelor's degree), perhaps with a provision to permit a certain number of years of experience to substitute for a certain level of education or vice versa
o Field of study (e.g., a degree in computer science or related field)
o Minimum years of experience working with information technology
o Minimum years of experience performing software development services
o Minimum years of supervisory experience for managers
o Minimum years of experience with a particular tool (e.g., programming language, database management system, or software development methodology)

The agency should establish realistic minimum qualifications. Requiring more years of education or experience will increase offerors' prices. Extra credit can be given in the proposal evaluation for personnel with more experience or advanced degrees, if provided for in the SSP and the solicitation. Agencies should set minimum corporate qualifications only if necessary to meet the agency's requirements for the software development services. Setting restrictive qualifications could lead to protests. However, the agency might establish minimum qualifications in areas such as:

o Experience with the required hardware, software, or telecommunications environment in terms of minimum number of years or projects
o Experience with a particular non-proprietary methodology or technique [e.g., Joint Application Development (JAD)] in terms of minimum number of years or projects
o Experience with a particular function (e.g., inventory control, financial management) that relates to the software to be developed in terms of minimum number of years or projects
o Number of software development projects of a similar size

e. Deliverable Items and Delivery Schedule

The agency should specify what the contractor must deliver and when. Both early (i.e., before coding begins) and some later deliverables will generally be written materials (e.g., a system manual, user's guide, training documentation). For these, the agency should describe the following:

o Purpose
o Topics to cover
o Length, if appropriate
o Format (e.g., written report, briefing package)
o Number of copies
o Media (e.g., paper copy, electronic copy), including electronic format [e.g., word processing package XYZ, American Standard Code for Information Interchange (ASCII)]
o Delivery dates
o Acceptance criteria

Some later deliverables may also include non-written materials (e.g., executable code). For these, the agency should specify the delivery date; electronic format standards, if delivered on electronic media; installation requirements, if the agency requires installation on the agency's computer; and acceptance criteria.

f. Interim Deliverables and Reviews

The agency should define contract tasks so that the delivery of the operational software is not the first time it reviews and comments on the overall design, the user interface, the supporting documentation, or the physical partitioning of data. By identifying interim deliverables, the agency can shape the operational software during its development. Not all interim deliverables need to be written documents. For example, the contractor could provide a briefing about the design. The agency can also require that the contractor provide periodic reviews and walkthroughs of components of the software program. In addition, the agency can make provisions in the contract for unannounced audits of work in progress. These deliverables permit the agency to review and modify the technical direction of the software development. The agency should also require the contractor to provide reports of the results of its internal testing of the software during development. These reports should discuss not only the progress of tests (e.g., X number of unit tests conducted during the past week), but also the nature of the errors discovered, the measures taken to correct them, and any impacts on the delivery of the software to the Government for acceptance testing.

g. Performance Measures

The agency should define in the contract, in precise terms, both quantitative and qualitative performance measures that it will use to determine quality. Well-defined measures make it easier to determine performance quality and help to avoid misinterpretations and possible disputes later.

(1) Quantifiable Performance Measures

The agency should identify quantifiable performance measures against which to measure the contractor's performance. Quantifiable measures have historically been difficult to develop for professional FIP support services such as software development. Measures such as "lines of code produced" provide little useful information and can be misleading. The lines of code can vary significantly, depending on the programming language and the complexity of the system. Also, elements designed to make the software easier to use, such as GUIs or artificial intelligence, can add significantly to the lines of code required. However, in recent years, the software industry has developed metrics to measure the quality of software. For example, in the late 1970s, A.J. Albrecht defined a metric called "function points." It includes weighted combinations of inputs to the software (weight of 4), the outputs from it (5), inquiries by users (4), the data files updated by the software (10), and interfaces to other systems (7). The weights represent the approximate difficulty of implementing each of the five factors. Agencies should seriously consider defining the size and complexity of their software in terms of function points rather than lines of code. (A simple illustration of this calculation appears at the end of this section.) Other quantitative metrics that can provide useful information on the quality of the software produced by a contractor and its progress towards completion of the software system include:

o Staffing profiles, comparing actual staff to planned staffing
o Progress against schedules and milestones for each stage of the development life-cycle
o Software volatility:
  oo Number of changes to requirements (especially in the later stages of the life-cycle)
  oo Number of engineering changes requested by the agency and the contractor
  oo Number of design or code changes needed because the contractor misunderstood or ignored requirements
o Statistics on errors in:
  oo Design
  oo Coding
  oo Testing
o Number of software units (e.g., programs) completed, over time, and in comparison to plan:
  oo Developed
  oo Unit tested
  oo Integrated
o Test coverage (i.e., percent of code or logic paths tested)
o Machine utilization:
  oo Central processing unit (CPU) capacity
  oo Main memory [also called Random Access Memory (RAM)]
  oo Disk storage space
  oo Throughput
  oo Input/output channel utilization
o Operational statistics:
  oo Response time (most useful if related to specific events, such as a query of a given degree of complexity, rather than an overall measure for the software)
  oo Mean-time-between-failures (MTBF) or software "crashes"
  oo Workload processed per unit of time
  oo Number and nature of calls to a software hotline

(2) Qualitative Software-Specific Measures

The agency should identify other quality measures for the software developed by the contractor, such as:

o Correctness -- The extent to which the software satisfies the contract specifications and the agency's requirements
o Reliability -- The extent to which the software performs its specified functions with the required precision
o Security/Integrity -- The extent of controls to prevent unauthorized access to the software or data
o Usability -- The effort required to learn, operate, prepare input to, and interpret output from the software
o Maintainability -- The effort required to detect, locate, and correct errors in the software, and maintain the documentation

o Flexibility -- The effort required to modify the software
o Portability -- The effort required to move the software from one hardware and system software platform to another
o Reusability -- The extent to which portions of the software may apply to other software applications
o Interoperability -- The effort required to integrate one software system with another

(3) Compliance with Documentation Standards

The agency should identify documentation standards against which it will measure written deliverables. For example, Federal Information Processing Standards Publication (FIPS PUB) 64 provides outlines for systems development documents. In addition, individual agencies may have their own documentation standards specified in their software development methodology or an agencywide writing manual.

(4) Compliance with Statement of Work (SOW)

The agency should identify performance measures directly related to the SOW. These may include both objective and subjective measures. For example, the SOW may describe some tasks the contractor has to complete with details about how to conduct them (e.g., following a methodology required by the agency). One performance measure could be whether the contractor completed these tasks in the way the agency specified. The SOW will also specify functional or performance requirements for the software. The agency may define, as a performance measure, the degree to which the contractor met the requirements in the completed software. The agency can define metrics or other quantifiable measures for some requirements (e.g., workload processed per unit of time). Other SOW requirements may require more judgmental qualitative measures (e.g., usability).
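The sketch below illustrates the function point metric described in subsection (1). It applies the weights quoted above (inputs 4, outputs 5, inquiries 4, files 10, interfaces 7) to hypothetical element counts. It is a simplified, unadjusted calculation offered only as an illustration; the counts are invented, and an actual function point count follows detailed counting rules beyond this Guide's scope.

    # Simplified, unadjusted function point calculation using the average
    # weights cited in the text (Albrecht): inputs 4, outputs 5, inquiries 4,
    # logical files 10, external interfaces 7. Element counts are hypothetical.

    WEIGHTS = {
        "inputs": 4,
        "outputs": 5,
        "user_inquiries": 4,
        "data_files": 10,
        "external_interfaces": 7,
    }

    def unadjusted_function_points(counts: dict) -> int:
        """Sum each counted element multiplied by its weight."""
        return sum(WEIGHTS[name] * counts.get(name, 0) for name in WEIGHTS)

    example_counts = {            # hypothetical counts for a small system
        "inputs": 20,
        "outputs": 12,
        "user_inquiries": 8,
        "data_files": 6,
        "external_interfaces": 2,
    }

    print(unadjusted_function_points(example_counts))
    # 20*4 + 12*5 + 8*4 + 6*10 + 2*7 = 246 function points

Because the count depends on what the software does rather than how it is coded, a figure like this can be stated in the contract and tracked without regard to the programming language the contractor uses.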

h. Review, Approval, and Acceptance Procedures

In addition to defining performance measures, the agency should state in the contract how it will review and approve the development services [e.g., using an Independent Verification and Validation (IV&V) contract], the software itself, and any other services (e.g., training) the contractor provides. For example, the agency might use the following process for a program specifications report prepared under a firm-fixed-price contract:

o The contractor will deliver a draft version of the specifications to the Contracting Officer's Technical Representative (COTR).
o The COTR will deliver copies to appropriate agency personnel for review and comment.
o The contractor will conduct a structured walkthrough of the specifications for the agency within X days.
o The COTR will determine whether the specifications comply with the standards (e.g., FIPS PUB 64) established in the contract, whether they meet the format and level of detail required in the SOW, whether they follow the outline and examples provided in the contractor's proposal or previously submitted workplan, and whether they reflect all agency requirements for the new software.
o The agency will return its comments to the contractor within Y working days, and the contractor will then have Z working days to incorporate the comments and deliver final program specifications.
o Upon receipt of the final specifications, the COTR will determine if all agency comments have been incorporated or if the contractor has justified not incorporating some of them.
o The COTR will recommend acceptance or rejection of the final program specifications by the CO. Only upon acceptance will the Government pay the contractor's invoice.

i. Financial Measures

(1) Incentives

OFPP Policy Letter 91-2 (reproduced in Appendix F) recommends that agencies use financial incentives to promote quality. Financial incentives vary by contract type, but may include arrangements such as cost-plus-award-fee, cost-plus-incentive-fee, and fixed-price-incentive. See Section 9.5 for a discussion of these contract types.

(2) Liquidated Damages

To ensure on-time delivery of services, the agency may include liquidated damages provisions in the contract. If the contractor fails to perform, except for excusable delays, the Government may seek liquidated damages or charge the contractor for actual costs attributed to the failure. To seek liquidated damages, the contract must include the right to and a formula for computing damages directly related to the harm incurred. Liquidated damages may not be punitive. Contracts that contain liquidated damages coverage usually assess damages at a fixed daily rate for late deliveries (a simple computational sketch appears at the end of this subsection). Chapter 12 discusses liquidated damages in detail.

(3) Deduction Schedules

The agency may also provide quality assurance (QA) deduction schedules in the contract. The schedules identify the financial damages the agency will assess if the contractor fails to meet established quality standards. QA deduction schedules are most appropriate for

services with objective and quantifiable performance measures; using them for software development services contracts is more difficult.
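The sketch below, offered only as an illustration with hypothetical figures, shows the fixed-daily-rate computation mentioned under (2) above; an actual clause must tie the rate to the harm the Government would actually incur and may not be punitive.

    from datetime import date

    def liquidated_damages(due, delivered, daily_rate):
        """Damages assessed at a fixed daily rate for each day of late delivery."""
        days_late = max(0, (delivered - due).days)
        return days_late * daily_rate

    # Hypothetical: delivery due June 1, made June 11, at $500 per calendar day.
    print(liquidated_damages(date(1993, 6, 1), date(1993, 6, 11), 500.0))  # 5000.0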

j. Status Reports and Progress Briefings

The contract should require the contractor to provide written status reports on a periodic basis, typically monthly. When planning the solicitation, the agency should identify only what it needs to monitor and control the contract. Requiring too frequent or unnecessary reports results in offerors increasing their proposed costs to compensate for the additional effort or, after award, diverting some resources from providing the software development services to generating the reports. The information in the status report should address each task the contractor has underway and include:

o Activities conducted and deliverables provided during the reporting period
o Activities and deliverables on schedule, ahead of schedule, or behind schedule, and the effect on future activities and deliverables
o Activities and deliverables scheduled during the upcoming reporting period
o Issues, problems, or concerns that could affect the contractor's technical, schedule, or cost performance
o Results of any decision meetings held between the agency and the contractor
o Anticipated changes in key personnel
o Resources expended (hours) and costs incurred during the reporting period for cost-reimbursable tasks; an estimate of percentage completion for fixed-price tasks
o Other information the agency needs or requests

In addition to written status reports, the agency may also ask the contractor to provide periodic progress briefings. Supplementing the one-way communication of written status reports, these briefings provide a forum for questions and answers, as well as informing senior management about contract status. Progress briefings may also address technical aspects of the contract work (e.g., preliminary design recommendations). The frequency of these briefings depends on how quickly task activities change and the degree of monitoring the contract needs, keeping in mind that progress briefings are usually billable activities.

k. Standards

The agency should identify Governmentwide, agency, or industry standards the contractor should follow. Standards reflect extensive input from other agencies and industry and are usually well understood, fully documented, noncontroversial, and widely supported. Therefore, using them provides an unambiguous yardstick for determining contractor performance. The FIPS PUBS discussed in Chapter 5 include software life-cycle documentation standards (e.g., FIPS PUBS 38, Guidelines for Documentation of Computer Programs and Automated Data Systems, and 64, Guidelines for Documentation of Computer Programs and Automated Data Systems for the Initiation Phase), which the agency can cite in the solicitation at its discretion. The agency may also have its own standards. For example, the agency's internal software development methodology may provide outlines and formats for written deliverables and dictate procedures for software development (e.g., procedures for structuring code). In addition, agencies typically have their own security standards for protecting classified, sensitive, or private information.

l. Development Methodology

Chapter 3 provides guidance on whether or not to specify a particular development methodology in the solicitation. If the agency decides to specify a methodology, it should make the methodology documentation and any software tools that are part of it available to offerors in the solicitation library. If the agency decides not to specify a methodology, it has two options. First, it could ask offerors to describe the methodology they intend to use and the benefits it offers the agency. The agency would evaluate this information according to factors and subfactors it specifies in the solicitation. Second, it could require that proposed methodologies include a certain minimum number of steps or activities (e.g., consider opportunities for prototyping). Again, the agency would evaluate this information according to factors and subfactors it specifies in the solicitation.

m. Technical Development Environment

In addition to specifying the technical environment requirements described in Chapter 5 (i.e., hardware, system software, telecommunications, integration with other systems), the agency should specify the development environment (e.g., programming languages, CASE tools) that it requires the contractor to use, if applicable.

n. Software Maintenance

Unless the agency awards a separate contract for maintenance, its solicitation should state the type of maintenance support required for the software. Maintenance for custom-developed software focuses primarily on correcting errors discovered through testing and use. The agency may require that such errors be corrected at no cost to the Government for a specified time after acceptance. The agency may also identify requirements for manual or automated systems to track software problems and their resolution. Whenever the agency or contractor changes a component of custom-developed software, the change can affect other components. Configuration management services (described below) can control these impacts.

o. Configuration Management

Configuration management refers to the process of establishing a baseline software version and structured tracking of additional versions. This includes making sure that test and production versions are kept separate and that software installed throughout a distributed environment is the same version. Configuration management can also refer to the management of different versions of the software before installation. Configuration management is an important function for large systems with large numbers of programs and databases. It is also important for systems in which software and/or data are distributed over several locations. Poor configuration management can slow the development process and cause reliability problems after installation. To ensure good configuration management, agencies often require offerors to submit a configuration management plan with their proposals (sometimes as part of a larger project plan) which the agency evaluates according to the factors and subfactors stated in the solicitation. Depending on the agency's situation, these might include factors such as:

o Ability to distribute upgrades to all regional offices overnight
o Ability to revert to a previous version in case of a problem with an upgrade
o Ability to produce up-to-date configuration management information within a specified time period
o Demonstrated experience with configuration management for an effort of similar size and complexity
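As a minimal sketch only (not a requirement of the Guide, and with hypothetical class and site names), the following shows the kind of information a configuration management process tracks: an ordered version history starting from a baseline, which sites run which version, and the ability to fall back to a previous version.

    class ConfigurationRecord:
        """Track a baseline, later versions, and where each version is installed."""

        def __init__(self, baseline):
            self.versions = [baseline]   # ordered history, baseline first
            self.installed = {}          # site name -> version running there

        def release(self, version):
            self.versions.append(version)

        def install(self, site, version):
            if version not in self.versions:
                raise ValueError("unknown version: " + version)
            self.installed[site] = version

        def out_of_date_sites(self):
            """Sites not yet running the latest release (e.g., regional offices)."""
            latest = self.versions[-1]
            return [site for site, ver in self.installed.items() if ver != latest]

        def revert(self):
            """Fall back to the previous version after a problem with an upgrade."""
            if len(self.versions) > 1:
                self.versions.pop()
            return self.versions[-1]

    record = ConfigurationRecord("1.0")
    record.release("1.1")
    record.install("Region 3", "1.0")
    print(record.out_of_date_sites())  # ['Region 3']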

p. Site Security

If the contractor's personnel will spend any time on-site, the agency should ensure that it has security procedures in place to protect sensitive information not needed in the development effort (e.g., information about future acquisitions). The agency should ensure that a contractor working off-site has security procedures in place to protect sensitive information (e.g., names and addresses of an agency program's beneficiaries being used for live test data).

q. Personnel Security

The agency should establish the security and privacy requirements the software development contractor must meet. For example, if the contractor is developing a sensitive agency system (e.g., financial management), the contractor's personnel may have to undergo Government security investigations. Similar requirements exist for contractor personnel with access to sensitive data (e.g., taxpayer information). Check agency-specific regulations for these requirements.

r. Records Management

Although not a solicitation requirement, the agency should identify, for internal use, the physical records that it will maintain for the software development services acquisition. The records could include contract files, copies of deliverables, briefing packages, status reports, and other documents

that are evidence of the contractor's efforts. This evidence:

o Serves as the basis for determining award and incentive fees or other possible contractual actions (e.g., default)
o Permits distribution of information to personnel who did not work closely with the contractor
o Provides for the agency's potential use to support later activities (e.g., a general design document provides backup for later review of the program specifications)
o Provides an audit trail that shows the contractor met the terms of the contract

The agency should identify who is responsible for maintaining the records, how long they should be retained, and how to dispose of them.

s. Budget

In the planning stage described in Chapter 4, the agency submits its budget request to OMB. At this point, the agency should refine its budget for the acquisition based on the information learned since then.

t. Contract Life

This describes the time frame within which the agency expects to use the software development services. Base the contract life on the estimated time required to design, develop, test, and install the software, and maintain it, if maintenance is acquired. For large software development services acquisitions, the contract life will normally be between two and five years.

9.5 SELECTING A CONTRACT TYPE

Contract types are grouped into two broad categories: fixed-price and cost-reimbursement. Generally, the contractor assumes more risk (of losing money) in fixed-price contracts and less risk in cost-reimbursement contracts. Conversely, the Government assumes more risk (of paying more than it expected) in cost-reimbursement contracts and less risk in fixed-price contracts. Time-and-materials (T&M) and labor-hour contracts, sometimes used for software development services, provide little incentive for cost control or labor efficiency. FAR Part 16 says to use them only if no other contract type is suitable.

a. Contract Types

FAR Part 16 identifies some of the factors an agency should consider in choosing (or negotiating) a contract type, including:

o Acquisition method -- Cost-reimbursement contracts are appropriate only for negotiated acquisitions. Contracts resulting from sealed bidding must be firm-fixed-price (FFP) or fixed-price with economic price adjustments (FPE). Economic price adjustments provide for upward or downward revisions of prices based on rules stated in the contract, covering contingencies such as changes in the contractor's underlying costs of labor or material.
o Price competition -- Normally, adequate price competition results in realistic pricing. When software requirements are well-defined and the technical environment is well-understood, offerors can estimate their costs with a reasonable degree of accuracy. In such cases, fixed-price contracts may be in the Government's best interest.
o Type and complexity of requirements -- Complex requirements, particularly those unique to an agency (e.g., software development services for an expert system using agency-specific decision rules), usually result in contract types in which the Government assumes greater risk than the contractor.
o Urgency -- If urgency is a primary factor, the agency may choose to assume a greater proportion of the risk or it may offer incentives to ensure timely performance.
o Period of performance -- Contracts extending over a relatively long period may include provisions for economic price adjustments. Also, changing circumstances may make a different contract type more appropriate in later periods than at the outset.
o Extent and nature of proposed subcontracting -- If the agency anticipates that offerors will subcontract extensively, it should choose the contract type that reflects the actual risks to the prime contractor. The prime contractor is free to negotiate its own contracts with subcontractors.

Except for small purchases or fixed-price contracts, the CO usually includes documentation in the contract file showing why a particular contract type was selected.

(1) Fixed-Price Contracts

The most common type of fixed-price contract used for software development services acquisitions is FFP, in which the contractor agrees to develop software meeting specific requirements at a specific price. The firm-fixed-price is not

adjustable no matter what it costs the contractor to develop the software. While this carries the greatest degree of financial risk for the contractor, it also offers the greatest potential for profit. Thus, the firm-fixed-price contract encourages efficiency. Fixed-price contracts are most appropriate when the requirements are unambiguous and known in detail, costs can be predicted with an acceptable degree of certainty, and adequate price competition exists in the marketplace (e.g., the development services are for software supporting common Government functions that will operate in a standard technical environment).

(2) Cost-Reimbursement Contracts

The most common type of cost-reimbursement contract for software development services is cost-plus-fixed-fee (CPFF). For this, the offeror first estimates the total cost of performing the work. The dollar amount of the fixed fee is then calculated, usually by applying an assumed profit margin to the cost estimate. The fee is negotiated and the dollar amount fixed at contract award. Cost-reimbursement type contracts require closer contract administration and more in-depth knowledge of FAR cost principles than fixed-price contracts. As the contractor performs the work, the Government reimburses it for allowable costs as provided in the contract. Some expenses (e.g., travel) are subject to limits. Others (e.g., entertainment) are not reimbursable at all. Agency and outside auditors, such as the Defense Contract Audit Agency (DCAA), can be used to monitor the contractor's costs. If, at the end of the work, the contractor's costs are below those estimated at contract award, the dollar amount of the fixed fee represents a higher profit margin. On the other hand, if the actual costs exceed the original estimate, the fixed fee represents a lower profit margin. Thus, fixing the dollar amount of the fee encourages the contractor to perform at or below the original cost estimate. CPFF contracts have two forms: completion and term. Under completion, the contractor is obligated to complete the work. Under term, the contractor is obligated only to provide a specified level of effort during a specified time. The completion form is preferred over the term form because the contractor has a greater obligation to deliver a useable product. Cost-reimbursement contracts are most suited to situations in which the Government's requirements cannot be stated in detail or the contractor must use an unproven or nonstandard technical environment to develop the software. However, if an agency changes its requirements (though staying within the scope of the contract), the contractor's costs may increase. If a contract modification is required to accommodate the new requirements, the contractor may petition the CO for a corresponding increase in its fee. To use a cost-reimbursement contract, the contractor must have an accounting system that supports monitoring and auditing functions. The CO should ensure such an accounting system is in place before awarding the contract. For cost-reimbursement contracts, the CO may negotiate limits for such costs as contractor overhead and administrative expenses before contract award. If an audit at the end of the contract shows the actual costs were higher, the contractor absorbs them; if lower, the Government and contractor settle before contract closeout. The Government must have sufficient monitoring mechanisms in place to provide reasonable assurance that the contractor is managing its costs efficiently.
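A minimal sketch, with hypothetical dollar figures, of the fixed-fee arithmetic described above: the fee is set from the cost estimate at award and does not change, so the profit margin the contractor actually realizes rises when costs run under the estimate and falls when they run over.

    def cpff_fee(estimated_cost, assumed_margin, actual_cost):
        """Return the fixed fee (set at award) and the margin realized on actual cost."""
        fixed_fee = assumed_margin * estimated_cost   # does not change after award
        return fixed_fee, fixed_fee / actual_cost

    # Hypothetical: $1,000,000 cost estimate, 8 percent assumed margin.
    print(cpff_fee(1_000_000, 0.08, 900_000))    # (80000.0, ~0.089) underrun, higher margin
    print(cpff_fee(1_000_000, 0.08, 1_100_000))  # (80000.0, ~0.073) overrun, lower margin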
(3) T&M Contracts

Some software development services may be acquired using a T&M contract. It provides for direct labor at fixed hourly rates (including contractor overhead, administrative loading, and a profit margin) and materials (including both non-labor components and expenses, either at cost or with administrative loading added). A T&M contract may be used only when it is not possible to predict accurately the number of labor hours required and no other contract type is suitable. However, because a profit margin is built into every hour worked, the contractor has little incentive to perform efficiently. Therefore, the contract must establish ceilings for the total dollar amount the contractor can bill the Government. Some T&M contracts also establish ceilings on the number of labor hours, either in total or by labor category. The agency must also monitor the contract closely to ensure the contractor uses efficient methods and effective cost controls.

(4) Incentive Contracts

Except for T&M, each of the contract types discussed above can encourage the contractor to control costs (i.e., in FFP contracts the contractor will lose money if its costs exceed the price; in cost-reimbursement contracts, the contractor's profit margin will be reduced if costs exceed the original estimate). Agencies may also find it worthwhile to incorporate award or incentive fee provisions, which reward high quality and/or fast delivery, into the contract. Incentive and award fees differ in the way they are determined. Incentive fees are based on quantifiable factors, one of which is always cost. Other factors may include technical or schedule performance. Award fees are based on judgmental factors, such as quality or technical ingenuity. However, in the contract, the agency must describe fully the factors it will consider in determining either the incentive or award fee. This protects

the agency from contractor claims that the fee was too low and charges by Government oversight agencies that the fee was too high. Awards and incentives must be structured to encourage the behavior the agency intends. For example, if an incentive provision requires the contractor to deliver by a certain date, but places no emphasis on quality, the agency could receive poor quality, although timely, deliverables. Also, to be motivated by an incentive or award fee, the contractor must understand the fee determination process and perceive it as fair. For example, incentive provisions based on unrealistic delivery dates will also motivate the contractor to emphasize timeliness over quality. Before deciding to use award or incentive fee provisions, agencies should determine how difficult they will be to administer. Because they require the CO to maintain substantiating documentation, agencies should consider using award or incentive fee provisions only if their benefits exceed their administrative cost. Agencies can use incentive or award fee provisions with either fixed-price or cost-reimbursement contracts in any of three combinations: Cost-Plus-Award-Fee (CPAF), Cost-Plus-Incentive-Fee (CPIF), or Fixed-Price-Incentive (FPI). The agency can also limit the incentive provisions to apply to only a subset of the services. Available award and incentive programs and their applicability to software development services are described below.

A. Cost-Plus-Award-Fee

The Government pays allowable costs, base fee, and award fee. The contractor earns a base fee that does not vary with performance and, in addition, earns all or part of an award fee based on the Government's unilateral evaluation of the contractor's performance in terms of the criteria stated in the contract. The CO (with input from program and IRM/technical personnel) determines the amount of the award fee. The agency evaluates the contractor's performance at stated intervals and makes corresponding partial payments of the fee. Because they emphasize quality, cost-plus-award-fee contracts are usually the most appropriate of the award/incentive contracts for software development services.

B. Cost-Plus-Incentive-Fee

The Government pays allowable costs and an incentive fee designed to motivate the contractor to manage costs effectively. A formula that compares actual costs to target cost (i.e., the originally negotiated cost estimate) determines the fee. Subject to negotiated minimum and maximum limits, the formula provides for fee increases when actual costs are lower than target costs and decreases when actual costs exceed target costs. The increase or decrease provides an incentive for the contractor to manage the contract effectively. CPIF contracts are most applicable when costs cannot reasonably be determined at the outset and the agency wants the contractor to make controlling costs a high priority. However, CPIF contracts may also include technical performance incentives for development projects when the probability of success is high and the Government can establish measurable performance objectives.

C. Fixed-Price-Incentive

The Government pays a price equal to the sum of the final negotiated cost and profit (i.e., incentive). A formula that compares final negotiated cost (i.e., the costs the Government agrees to reimburse at the contract's end) with target cost (i.e., the contractor's original cost estimate) establishes the final contract price. The price is subject to a ceiling determined during negotiations.
FPI contracts are less applicable to software development services than the other two types of incentive contracts because the contractor may have to assume some cost responsibility and will never be paid more than the price ceiling. Because software development requirements often carry risk for the contractor (e.g., requirements are not fully defined at the outset), contractors are often not willing to assume this risk.
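The following sketch illustrates the kind of cost-comparison formula described under B above. The share ratio, fee limits, and dollar amounts are hypothetical assumptions for illustration only; the Guide does not prescribe specific values, which are negotiated for each contract.

    def cpif_fee(target_cost, target_fee, actual_cost,
                 min_fee, max_fee, contractor_share=0.20):
        """Raise the fee when actual cost runs under target and lower it when actual
        cost runs over, clamped to the negotiated minimum and maximum fee limits."""
        adjusted = target_fee + contractor_share * (target_cost - actual_cost)
        return max(min_fee, min(max_fee, adjusted))

    # Hypothetical: $1,000,000 target cost, $80,000 target fee,
    # fee limited to $40,000-$120,000, 20 percent contractor share.
    print(cpif_fee(1_000_000, 80_000,   900_000, 40_000, 120_000))  # 100000.0 (underrun)
    print(cpif_fee(1_000_000, 80_000, 1_200_000, 40_000, 120_000))  # 40000.0 (overrun, at floor)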

b. Issuing Agencywide Indefinite Delivery Contracts

When an agency acquires software development services frequently but is not certain of the exact amount and/or timing of services it will require, it may issue an indefinite delivery contract (also called a task order or delivery order contract) for use by its various offices. However, indefinite delivery contracts are not open-ended. They must state a definite period of performance. The three types of indefinite delivery contracts are:

o Definite Quantity -- States a definite quantity (e.g., a specific number of hours) of software development services
o Requirements -- Provides for filling all an agency's needs for software development services
o Indefinite Quantity -- States minimum and maximum quantities (e.g., hours) of software development services

Agencies use the second two types most frequently for software development services. FAR Part 16 discusses indefinite delivery contracts in greater detail. Indefinite delivery contracts offer agencies several advantages:

o Limited Government obligation -- Indefinite quantity contracts limit the Government's obligation to the minimum quantity

specified in the contract. For software development services contracts, this means that the agency can specify optional tasks (e.g., pilot testing at regional offices) that it is not obligated to exercise (e.g., if requirements change or if a lower than anticipated budget does not allow the agency to exercise all of the task options).
o Funds are not obligated immediately -- For requirements contracts, funds are obligated by delivery order, not by the contract itself. For indefinite quantity contracts, the agency must obligate funds for the minimum quantity at contract award. However, above the minimum, the agency obligates by delivery order. This allows the agency more discretion in using its funds. For example, if priorities change, the agency is free to divert funds to a higher priority project.
o Faster acquisition -- When an agency awards an indefinite delivery contract, it needs less time to exercise a delivery order than it needs to award a new contract.
o More predictable costs -- Rate structures for indefinite delivery contracts are often negotiated at the outset. Therefore, agencies may only need to negotiate the level of effort (i.e., number of hours and personnel mix) with the contractor for individual delivery orders. For budgeting in advance of negotiations, agencies can usually estimate a reasonable level of effort, apply this to the agreed-upon rates, and use the total in the budget (see the sketch at the end of this section).
o More predictable contractor performance -- If the indefinite delivery contract is available for agencywide use, offices can exchange information concerning the contractor's technical capability, responsiveness to client needs, ability to meet schedule and cost constraints, etc. This encourages the contractor to provide effective service.

A limitation of indefinite delivery contracts is that the agency must bind the contract in terms of:

o Period of performance
o For indefinite-quantity contracts, the maximum quantity or value for a specified period of time (e.g., by contract year)
o Monetary or quantity ceilings by task order
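As a purely illustrative sketch of the budgeting arithmetic described under "More predictable costs" above (labor categories, hours, and rates are hypothetical), a delivery order estimate simply applies the assumed level of effort to the rates negotiated at contract award.

    # Hypothetical negotiated rates ($/hour) and assumed level of effort (hours).
    rates = {"project manager": 95.0, "senior programmer": 70.0, "programmer": 50.0}
    hours = {"project manager": 200, "senior programmer": 800, "programmer": 1600}

    budget = sum(hours[category] * rates[category] for category in hours)
    print(budget)  # 200*95 + 800*70 + 1600*50 = 155000.0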

CHAPTER 10: CONTRACTING BY NEGOTIATION

Agencies can acquire software development services by a number of different methods. This chapter discusses the most common, contracting by negotiation. Chapter 11 discusses other methods (small purchase procedures, established sources of supply, and sealed bidding). The Federal Acquisition Regulation (FAR) Part 15 and the Federal Information Resources Management Regulation (FIRMR) prescribe the policies and procedures for contracting by negotiation. Negotiation (1) involves issuing a solicitation and receiving proposals, (2) permits holding discussions and negotiations with offerors in the competitive range, and (3) usually provides offerors the opportunity to revise their proposals before contract award. The agency awards the contract to the offeror whose proposal is most advantageous to the Government, considering all evaluation factors included in the solicitation. Another guide in this series, A Guide for Evaluating Proposals and Bids, provides detailed information on evaluation and source selection approaches.

10.1 ATTRIBUTES OF NEGOTIATED CONTRACTS

a. Differences Between Negotiation and Other Contracting Methods

According to FAR Part 15, any contract awarded without using sealed bidding is a negotiated contract. It differs significantly from sealed bidding in the following ways:

o Form of the solicitation -- The Government states its requirements in the form of a Request for Proposals (RFP), not an Invitation for Bids (IFB), and vendors respond with proposals that usually contain significantly more information than a bid.
o Disclosure of information -- Unlike bids, proposals are not available for public inspection.
o Negotiations/discussions -- The Government can negotiate with offerors in the competitive range about price, schedule, technical requirements, type of contract, or other terms of the proposed contract. Discussions can cover achieving a mutual understanding of requirements or bargaining for more favorable terms or both and provide the offeror an opportunity to revise or modify its proposal.
o Latitude in specifying requirements -- Because discussions are allowed, the specifications in the solicitation can be broader and functionally oriented.
o Factors to consider in making the award -- Source selection officials have greater discretion in awarding a contract to an offeror whose proposal is most advantageous to the Government, considering all factors, not just total cost. The Government can consider the technical merit of proposals exceeding minimum requirements, if this is provided for in

the solicitation. Therefore, the quality of the proposal, an important factor in software development services acquisitions, can be considered.
o Allowable contract types -- Agencies can use any contract type described in FAR Part 15.

Negotiated contracts usually require more Government time and personnel than sealed bidding contracts because the agency must develop evaluation factors, rate technical proposals, compare technical and cost proposals, and conduct discussions and negotiations -- steps not performed in sealed bidding contracts. Offerors also often invest more resources responding to a negotiated contract than to a sealed bidding contract. Negotiated contracts are also more susceptible to protest than other methods because they are more complex, the evaluation requires the exercise of judgment, and the results may be questioned more easily.

b. Contract Attributes

When conducting a negotiated contract, the agency must first determine that conditions for sealed bidding cannot be met. The agency then must determine the contract attributes, such as contract type (e.g., cost-reimbursable, fixed-price), definite delivery vs. indefinite delivery, number of awards, etc.

10.2 THE SOURCE SELECTION PLAN (SSP)

The SSP outlines how the agency will select the most advantageous offer. It describes the--

o Source Selection Team (SST) -- Who will evaluate the proposals and how the team will be organized
o Evaluation factors -- The attributes of the Government's requirement on which proposals will be evaluated
o Source selection approach -- How the Government will trade off between cost and technical considerations
o Source selection schedule -- When evaluation activities and contract award will take place

The level of detail and comprehensiveness of the plan depend on the size, scope, and complexity of the acquisition. For a small software development services acquisition (e.g., a microcomputer-based employee database for a small regional office), the plan need not be developed to the same level of detail as for a large, complex acquisition (e.g., agencywide, mission-critical software).

a. Source Selection Team Organization

This Guide uses the term SST to cover all individuals involved in proposal evaluation and contract award processes for a negotiated contract. FAR Part 15 specifically identifies responsibilities for three roles:

o Agency heads (or their designees) -- Responsible for source selection
o Cognizant technical official -- Responsible for the technical requirements related to the source selection process. This official should have the requisite depth of knowledge in the program to be supported or the products or services to be acquired.
o Contracting Officer (CO) -- Responsible for all contractual actions related to the source selection process, including:
  oo Issuing the solicitation
  oo Conducting or coordinating price/cost analyses
  oo Conducting and controlling negotiations
  oo Selecting the source for contract award unless another official is designated as the Source Selection Authority (SSA)

For most software development services acquisitions, the CO serves as the SSA. The SSA appoints the rest of the SST, which is typically divided into two panels.

o Technical Evaluation Panel (TEP) -- Evaluates technical proposals
o Cost Evaluation Panel (CEP) -- Evaluates cost or price proposals and total cost

Depending on the size and/or riskiness of the acquisition, agencies may augment this organization with a Source Selection Evaluation Board (SSEB) and/or Source Selection Advisory Council (SSAC) to review the TEP's and CEP's evaluations and perform a comparative analysis. The number of members on the TEP and CEP depends on the types of expertise required, the number of proposals anticipated, and the time allotted for completing the source selection process. In performing their functions, the TEP and CEP do not interact directly with each other. This prevents cost considerations from influencing the technical evaluation. Instances may occur, however, when limited information must be exchanged through the CO. For example, the labor categories for which an offeror submits prices or costs in the cost proposal must be the same categories as in the technical proposal. As another example, if an offeror assumes that the Government will provide a software development tool not specified in the solicitation, this cost to the Government must be factored into the total cost evaluation. The CO tightly controls any exchange of information to ensure no direct interaction takes place between the two panels.

b. Determining the Evaluation Factors

The SST uses evaluation factors to determine the relative merit of proposals. Acquisitions for software development services typically use two types of factors: price or cost and technical.

(1) Price or Cost

FAR Part 15 requires price or cost as an evaluation factor in every source selection. It defines price as the cost paid to the contractor plus any fee or profit applicable to the contract type (e.g., award or incentive fees for cost-reimbursable contracts). FIRMR Part 201-39 defines total costs to include:

o All prices for Federal information processing (FIP) resources, including basic and optional quantities, contract periods, and optional FIP resources
o Other support and in-house costs over the system life for installing, operating, and disposing, where quantifiable and when these costs may differ based on offers received
o Any costs of conversion that can be stated in dollars, as well as other costs directly related to converting from installed to augmented or replaced FIP resources. However, the costs associated with the following shall not be included:
  oo Conversion of existing software and databases that are to be redesigned regardless of whether or not augmentation or replacement FIP resources are acquired
  oo Purging duplicate or obsolete software, databases, and files
  oo Development of documentation for existing application software
  oo Improvements in management and operating procedures

(2) Technical or Quality

FAR Part 15 also requires that every source selection include evaluation factors that measure quality even when the award is based on the lowest total cost. For software development services acquisitions, evaluation factors that determine the quality of an offering are often known as "technical" evaluation factors. Neither the FAR nor the FIRMR dictates specific technical factors; thus, agency acquisition officials can determine the factors that apply to a specific acquisition for their specific needs. Each factor may be itemized in levels of greater detail; however, the agency should try to limit the total number of items considered. Too many result in no single item having a significant impact on the overall rating. This leads to a leveling of ratings, making it difficult to choose among offerors. Technical evaluation factors that agencies use to evaluate the quality of a software development services offering often measure quality in the areas described below.

A. Proposed Approach

Agencies typically require offerors to submit, as part of their proposals, an overall plan for developing the software. Agencies can judge, in part, the quality of an offeror's software development services by the soundness and comprehensiveness of this plan. Agencies can include in the solicitation technical evaluation factors that address particular aspects of a plan, such as the offeror's technical approach and plan for managing the project.

B. Personnel Qualifications

For software development services, the people performing the work will have the greatest effect on quality. Thus, to predict quality, the agency should carefully examine the skills, training, and experience of contractor personnel. The primary mechanism for this examination is the resume, although the agency may also ask offerors to summarize some information about personnel in tables in their proposals. The information that resumes and tables provide includes:

o Education -- Personnel with more years of post-secondary education tend to be more knowledgeable and, thus, provide higher quality than people with fewer years.
In addition, people with degrees in relevant fields (e.g., computer science) should have the ability to provide higher quality than those with degrees in unrelated fields. Since FIP resources in general and software development techniques in particular change rapidly, the agency should examine more than formal college or graduate education. People who have attended training classes on a regular basis, keeping their skills and familiarity with new techniques and tools up-to-date, should be able to provide higher quality than those relying solely on college training.
o Years of experience -- Personnel with more years of relevant work experience should provide higher quality than those with fewer years. Since education and on-the-job experience both enhance skills, agencies may allow offerors to partially substitute one for the other in meeting solicitation requirements (e.g., a Bachelor's degree might substitute for three years of work experience).
o Relevance of experience -- For software utilizing specific technologies (e.g., digital imaging) or products (e.g., a database management system package used by the agency) or supporting specific functions (e.g., financial management, inventory control), personnel with experience in these areas should provide higher quality than those with no experience in the technologies, products, or functions. The agency should also consider whether skills in similar, although not identical, situations can be adapted to the agency's particular needs.

C. Corporate Qualifications

The information provided by corporate qualifications, especially relevance of experience, is similar to that provided by the personnel qualifications. The

contractor should have the ability to apply the lessons learned from previous efforts when providing new software development services to the agency. However, the value of corporate experience can only be realized if the same contractor personnel who worked on the previous efforts can support the new contract, at least as internal experts if not full members of the team. For this reason, corporate qualifications are not as effective a predictor of quality as personnel qualifications. However, they can often show how well company management monitors work and resolves problems. The primary mechanism for examining corporate qualifications is project descriptions included in offerors' proposals. The agency may also ask offerors to summarize information in tables for easier review. Finally, the agency may ask offerors to answer specific questions (e.g., Has the offeror ever had a contract terminated for default? What is the annual turnover rate of the offeror's software development personnel?). The agency should also seek information about the depth of experience of the offeror's personnel. For example, an agency issued a request for proposals for development of software to run in a Portable Operating System Interface (POSIX) computer environment. Two offerors provided strong resumes for key personnel in their proposals. However, 50% of Offeror A's personnel (including non-key people) had POSIX experience, while none of Offeror B's personnel did. In this example, Offeror A is more likely to provide high quality services.

D. Reference Checks

By skillful writing of resumes and project descriptions, a vendor may appear well qualified, predicting high quality. However, surface gloss may hide defects underneath. For this reason, agencies should check references to validate the information in the resumes and project descriptions. Agencies should ask offerors to provide references for proposed personnel and for projects cited in the proposal. Someone knowledgeable about the agency's requirements should then check the references by telephone or in writing. Reference checks can validate:

o The truthfulness and completeness of the descriptions of experience
o How the reference's situation (e.g., mission, organizational structure, development environment, functions supported by new software) applies to the agency's situation
o Whether, in the reference's opinion, the vendor and personnel provided high quality
o Whether any problems occurred in schedule or cost performance that might predict similar problems for the agency checking the reference

E. Oral Discussions

The agency might also consider meeting with the offeror's key personnel as part of the proposal evaluation process. The evaluators could ask them about the vendor's technical approach or their experience with particular languages or automated tools. Such discussions are useful for validating information provided in the offeror's proposal.

F. Sample Deliverables

The agency can make its own assessment of quality by asking vendors to submit sample written deliverables with their proposals. These could be actual deliverables (e.g., logical design documents, users manuals) developed for other clients. While written deliverables cannot provide information on all aspects of software development (e.g., code generation), they are useful for assessing the quality of some aspects. At the least, they can provide useful information on the quality of the written deliverables the contractor will have to deliver to the agency.
The agency can review the sample deliverables for:

o Completeness
o Organization, including indexing and cross-referencing
o Quality of writing
o Understanding of the agency's mission, activities, and development environment

G. Vendors' Internal Quality Assurance (QA) and Control Processes

The agency can also ask vendors to describe the internal QA process they would use on the contract. A vendor with a detailed, systematic, and thoughtful QA approach, especially one that seems to reflect its experience on other software development projects, will more likely deliver quality than a vendor that claims to provide quality work with little detail to justify the claim. The agency should consider asking vendors to describe in their proposals:

o How they ensure technical quality
o How they perform configuration management and software version control
o How they monitor and control costs
o How they monitor schedule performance and avoid late deliveries
o What problems might occur in conducting the work and how they would detect and mitigate them

H. Cost Realism

Finally, vendors' proposed costs, especially the underlying direct labor costs of the proposed personnel, offer a measure of quality. A system designer paid $50,000 per year is more likely to provide quality than one paid $20,000 per year. In addition, the agency must question the true qualifications of people likely to perform the work if the proposed key person in a labor category has a Ph.D. in systems engineering, but the direct labor rate for the category appears too low for that level of expertise. The agency should always assess cost realism of proposals for software development services. The Department of Labor issues wage determinations under the Service Contract Act which establish minimum hourly wages for labor categories in various localities.

Although the primary purpose of these determinations is not to predict quality but to protect the rights of workers, they provide a "floor price" against which to measure an offeror's labor costs. The CO should enforce cost realism. This is sometimes difficult because agencies usually conduct the cost and technical evaluation of proposals on separate tracks until shortly before contract award.
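As an illustration only, with hypothetical labor categories and rates, the following sketch shows the kind of floor-price comparison described above; proposed rates that fall below the applicable wage determination warrant a cost realism question to the offeror.

    # Hypothetical wage-determination floors and proposed direct labor rates ($/hour).
    wage_floor = {"systems analyst": 22.50, "programmer": 17.00}
    proposed_rates = {"systems analyst": 19.00, "programmer": 18.50}

    below_floor = {category: rate
                   for category, rate in proposed_rates.items()
                   if rate < wage_floor[category]}
    print(below_floor)  # {'systems analyst': 19.0} -- a flag for the cost evaluation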

c. Source Selection Approaches

In negotiated contracts, agencies usually follow one of three approaches to determine which proposal provides the most advantageous alternative to the Government, cost and technical factors considered.

o Lowest Total Cost Acceptable Proposal -- This selection method relies solely on a pass/fail (or go/no-go) technical rating and total cost. Each proposal that meets all the minimum requirements is deemed technically acceptable. The agency awards the contract to the technically acceptable, responsible offeror with the lowest total cost proposal. This method is appropriate when dependence on total cost as the determining factor adequately ensures selection of the most advantageous proposal. It is not usually appropriate for software development services acquisitions in which technical quality is a significant factor.
o Explicit weighting of cost versus technical considerations -- The agency awards the contract based on a formula that weights both relative cost and technical merit (a simple arithmetic sketch follows this list). For example, the SSP may state that the award will be based 40 percent on cost considerations and 60 percent on technical considerations. Usually, it is not a good idea to reveal the specific weights in the solicitation since this may overly influence offerors' proposals. For example, if offerors knew that cost had a very low weight, they would have little incentive to keep their prices low, and the Government might pay more than necessary. Instead, the solicitation should state only the relative weights of cost and technical factors (e.g., cost is less important than technical). This method involves several disadvantages. GSA discourages agencies from using it and encourages them to use the greatest value or lowest total cost acceptable proposal methods, as appropriate.
o Greatest value -- The agency awards the contract to the offeror whose proposal is most advantageous to the Government, cost and other factors considered. The solicitation states the relative importance of cost and other factors; however, it does not establish an explicit mathematical relationship among them. The agency can award the contract to an offeror other than the one with the lowest total cost, if the proposal's technical merit (considering strengths, weaknesses, and risks) justifies the higher cost. The agency must conduct a meaningful analysis to make this tradeoff; however, no magic formula exists. The decision requires that the SSA exercise effective business judgment. GSA has published guidance on the greatest value method entitled Important Considerations for Source Selection of Federal Information Processing (FIP) Resources Using the Greatest Value Approach.

The solicitation must describe the approach the agency will use to select the proposal that is most advantageous to the Government.
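The sketch below, using the 40/60 split from the example above and hypothetical, already-normalized scores, shows the arithmetic behind the explicit weighting approach. It is illustrative only; as noted, GSA discourages this method, and the normalization of cost and technical ratings to a common scale is an assumption not prescribed by the Guide.

    def weighted_score(cost_score, technical_score, cost_weight=0.40, tech_weight=0.60):
        """Combine cost and technical scores (both on the same 0-100 scale) by weight."""
        return cost_weight * cost_score + tech_weight * technical_score

    # Hypothetical offerors: A is cheaper, B is technically stronger.
    print(weighted_score(90, 70))  # Offeror A: 0.4*90 + 0.6*70 = 78.0
    print(weighted_score(70, 85))  # Offeror B: 0.4*70 + 0.6*85 = 79.0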

10.3 STEPS IN CONTRACTING BY NEGOTIATION

This section describes each step in the process of competing and awarding a contract by negotiation. The steps are summarized in Figure 10-1.

a. Prepare and Issue the Solicitation

A direct relationship exists between the quality of the solicitation and the quality of the offers received. Therefore, the agency should carefully prepare the solicitation to generate both maximum competition and industry's best efforts to meet the agency's needs. This section describes the steps the agency follows to get the solicitation "on the street" and receive proposals.

(1) Write the Solicitation

The best solicitations result from program, Information Resources Management (IRM)/technical, and contracting staff working together. The process can be complex and time-consuming. Agencies should consider, as models, successful past solicitations used to acquire software development services. The agency's contracting office may have them on file. Agencies can also request samples from other agencies identified through the Commerce Business Daily (CBD) or trade journals such as Government Computer News or Federal Computer Week.

A. Solicitation Composition

The solicitation should contain all the information necessary for prospective offerors to prepare their proposals. The FAR requires that solicitations and resulting contracts use the Uniform Contract Format (UCF) as shown in

Figure 10-2. Solicitations are composed of 13 Sections, labelled A through M. All sections, except L and M, become part of the contract, as does the contractor's proposal. Section K, as completed by the contractor, is incorporated into the contract by reference.

B. Define Functional and Other Requirements

The specifications and work statement contained in Section C of the solicitation describe the functional requirements of the software to be developed (see Chapter 5), the technical requirements of the environment in which the software will operate (see Chapter 9), and other requirements the software development services contractor must meet (see Chapter 9). The language used in Section C is more precise, specific, and detailed than that used in the requirements analysis document. For example, Section C must define every activity the contractor must conduct and every deliverable and other output it must provide.

C. Key Personnel Positions

Key factors affecting the quality of software development services include the skills, experience, and knowledge of the contractor's personnel. As a predictor of quality, the agency typically asks offerors to provide resumes for the people proposed for use on the contract. The agency may also ask for and check references. However, the agency needs assurance that the people whose qualifications it evaluates will work on the contract. Most offerors can provide resumes that show considerable relevant experience, education at top schools, and so forth. The danger lies in the possibility that they might showcase these resumes in many of their proposals. The "experts" that helped the offeror win the contract may have left the company, be tied up indefinitely on another contract, or show up for the contract kickoff, then be replaced by someone with less experience, poor skills, or bad references. To guard against these "bait and switch" tactics, agencies should include key personnel clauses in the contract. These clauses usually provide that:

o The people an offeror names as key personnel in its proposal must be available at the date of award to work on the contract
o Substitutions for key personnel will not be permitted for a fixed period after contract award (e.g., 90 days, 1 year) except in extraordinary circumstances, such as illness, death, or termination of employment
o The contractor must request substitutions in writing and obtain written approval from the CO before they can take effect
o The contractor must provide a resume for the proposed substitute, who must have qualifications equal to or higher than those of the key person being replaced

Agencies may also permit offerors to designate both a primary and alternate key person for each position with the requirement that one will work on the contract.

D. Provisional Hires

Sometimes offerors will propose people not currently employed by the firm, agreeing to hire these people if they receive the contract award. Typically, this happens when a contract requires specialized skills outside the offeror's normal business, or when the potential size of the development team is large compared to the size of the offeror's overall staff. The provisional hire will probably currently work as an independent contractor over whom the offeror has little control. Agencies should require in the solicitation that the offeror provide with its proposal an employment agreement signed by both parties, certifying the availability of the proposed provisional hire for work at contract award.
[Figure 10-2: Uniform Contract Format]

(2) Develop an Independent Government Cost Estimate (IGCE)

FAR Part 15 specifies that the Government must develop an independent projected cost for all negotiated acquisitions. Program personnel, assisted by IRM/technical personnel, usually prepare this estimate. The IGCE should be organized by Contract Line Item Number (CLIN), as described in Section B of the RFP. It should also describe:

o The methodology used to derive the estimate
o Assumptions
o Constraints
o Sources of or basis for the cost data

The CO must approve the IGCE before the agency issues the solicitation.

(3) Obtain Industry Comments on the Draft Solicitation (Optional)

The FAR gives agencies the option to provide potential offerors with a copy of the proposed requirements before release of the formal solicitation by issuing a draft solicitation for information or planning purposes. A Request for Comment (RFC) usually consists of Sections C, L, and M of the UCF, although it may contain others. Industry comments may result in changes to the requirements or terms and conditions that the Government considers in its best interest, including maximizing competition. However, the agency is free to use the comments as it sees fit and is not required to respond to the vendors. The FAR allows use of an RFC when necessary information is not available by more economical and less formal means. It may be appropriate when the acquisition involves complex requirements, high estimated cost, or scarce resources (e.g., specialized skills). A management level higher than the CO must approve issuance of such a draft solicitation. Although it usually adds both time and resources to the acquisition process, issuing an RFC frequently results in clarifying requirements and may reduce the number of amendments to the solicitation.

(4) Hold a Presolicitation Conference (Optional)

The agency

may hold a presolicitation conference for potential offerors to provide an overview of and answer questions about the acquisition and upcoming solicitation. This conference is another way agencies can communicate with the vendor community about an acquisition without the possibility of limiting competition.

(5) Develop Source Selection Materials

The agency should prepare detailed guidance for the SST on how to conduct and document the evaluation. This guidance provides direction for implementing the evaluation methodology established in the SSP. It must support evaluation of both the proposal and the offeror's ability to successfully accomplish the work. Agencies can evaluate competitive proposals only on the factors specified in the solicitation. As a result, agencies must ensure that the detailed guidance, SSP, and solicitation are consistent. No Governmentwide standards exist for source selection materials. However, agencies may establish regulations or guidance regarding the format and content. In general, these materials should include:

o General evaluation guidance -- Describes the steps in the evaluation process, the organization of the SST, and cautions to evaluators (e.g., thoroughly document the evaluation, route all communication with offerors through the CO).
o Checklist for ensuring compliance with mandatory requirements -- Lists each mandatory requirement from the solicitation with space for the evaluator to note whether it has been met and where in the offeror's proposal the requirement was addressed.
o Instructions for technical rating -- Explains how to rate a proposal on each evaluation factor and subfactor and identifies the standards to use. The instructions also identify where to look in the proposal for the information to rate.
o Technical rating materials -- Forms the evaluators complete to document technical ratings and the rationale behind them.
o Instructions for total cost evaluation -- Explains how to calculate the total cost. This entails combining costs incurred by the Government [e.g., for Government-furnished equipment (GFE)] with the prices/costs listed in the offeror's proposal (see the sketch at the end of this subsection). The instructions should also cover procedures for validating offerors' proposed prices/cost.
o Total cost evaluation materials -- Forms the evaluators complete that document the price/cost analyses and calculation of total cost.

Preferably, the agency develops these materials in parallel with the solicitation and before solicitation issuance. At the latest, the agency should complete them before the date proposals are due. The evaluation materials must be consistent with the evaluation approach and factors described in Section M of the RFP. If amendments to the RFP change Section M, the agency must update the source selection materials.

(6) Publicize the Solicitation in the CBD

The CBD is the principal means of publicizing the Government's requirements. Unless one of the exceptions in FAR Part 5 applies, the CO must announce proposed acquisitions in the CBD, which is published up to six times each week. The notice provides a brief summary of the acquisition and tells interested vendors how they can get a copy of the solicitation. It must appear at least 15 days before the agency issues the RFP. The notice should provide sufficient information about the acquisition to enable vendors to decide if they want to respond.

(7) Issue the Solicitation

The CO issues the solicitation to all vendors who have expressed an interest in the acquisition as well as those on the agency's mailing list for that type of acquisition.
Agencies must, by statute, allow a minimum of30 days from the date of issuance of the solicitation for offerors to respond. In complex acquisitions,agencies often give offerors more time to prepare their proposals. (8) Hold a Preproposal Conference(Optional) For large or complex acquisitions, the agency often holds a meeting for prospective offerorsto provide an overview of the acquisition's objectives and source selection methodology and to answerquestions. At the conference, agency program, IRM/technical, and contracting personnel present theoverview, and the CO usually answers the questions submitted in writing or asked from the floor. TheCO may decline to answer at that time if enough information is not available or the agency needs moretime to consider its response. The CO would then issue written answers later. The agency mustdistribute proceedings of the meeting, including questions, answers, and a list of attendees, to allprospective offerors who requested the solicitation, even if they did not attend. (9) Answer Questionsand Amend the Solicitation During the time allowed for proposal preparation, offerors may ask theagency in writing to clarify information or specifications. The agency, through the CO, provideswritten answers to all parties who requested the solicitation. The CO may amend the solicitation,depending upon the issue(s) raised. The solicitation can be changed only by amendment and not bypublishing answers to offerors' comments or questions. The agency must also update the SSP andsource selection guidance and materials to reflect changes made by the amendments. The COestablishes a reasonable time frame for questions or requests for clarification. Beyond this, noadditional questions are accepted.
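The CBD notice and the statutory response period impose minimum lead times on the schedule: at least 15 days between the CBD notice and RFP issuance, and at least 30 days for offerors to respond. The following Python sketch is a simple illustration of the earliest dates those minimums allow; the notice date shown is a hypothetical example, and complex acquisitions normally allow more time.

    # Illustrative sketch: earliest milestone dates permitted by the minimum lead
    # times described above (15 days from CBD notice to RFP issuance, 30 days for
    # proposal preparation). The notice date is a hypothetical example.

    from datetime import date, timedelta

    cbd_notice_date = date(1995, 1, 9)
    earliest_rfp_issue = cbd_notice_date + timedelta(days=15)
    earliest_proposals_due = earliest_rfp_issue + timedelta(days=30)

    print(f"CBD notice published:       {cbd_notice_date}")
    print(f"Earliest RFP issuance:      {earliest_rfp_issue}")
    print(f"Earliest proposal due date: {earliest_proposals_due}")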

b. Evaluate Proposals

This section briefly describes how the SST evaluates proposals. The exact sequence of events can vary based on the unique aspects of an acquisition and agency practices. An agency may conduct some of these activities in parallel. The acquisition process is designed to maximize competition, protect offerors' confidential business information (e.g., prices/costs, technical approaches), and ensure fair and equitable evaluation of proposals. Therefore, proposal control and security are essential. Protests have resulted from failure to provide adequate control and security. All proposals should be handled in accordance with agency regulations. At a minimum, proposals should be locked away when not in use. The agency should also set aside a secure area in which the evaluation team can work. The FAR requires that the CO control discussions with offerors about the solicitation and their proposals and transmit technical or other information. Thus, all communication with offerors should be through the CO. The TEP and CEP should forward any questions or requests for clarification to the CO, who will review and forward them to the offerors. Similarly, offerors should submit information requested by the Government to the CO, not to members of the TEP or CEP. When communicating with offerors, the CO should not disclose any information on the number or identity of other offerors or the contents of their proposals. This information should not be disclosed to anyone outside the SST. If additional information about the acquisition or the agency's specifications must be disclosed as part of the evaluation process, the CO should ensure that all offerors receive the same information.

(1) Train the SST

The CO, with support from advisors (e.g., legal counsel, procurement integrity officers), usually trains members of the SST in the source selection process. Members should be familiar with their responsibilities, the evaluation procedures, security/nondisclosure requirements, the evaluation materials, standards of conduct, procurement integrity requirements, and conflict of interest considerations before beginning their evaluation.

(2) Receive Proposals

Offerors must deliver proposals by the designated date and time to be considered responsive to the solicitation. The agency should mark them with the time and date of delivery and, if requested, provide a receipt for hand-delivered proposals. Proposals received after the deadline are usually not considered unless they meet FAR Part 15 rules governing late proposals.

(3) Determine Whether Proposals Comply with Solicitation Instructions

The CO first checks each proposal against the format and content requirements specified in Section L of the solicitation. If any are not in compliance, the CO may:

o Request clarification -- The CO may ask the offeror the reason for the noncompliance, then determine if the offeror can easily correct it. If so, the CO will usually permit it. The CO does not give the proposal to the SST for evaluation until it complies with Section L sufficiently to permit evaluation.
o Eliminate the proposal -- If the proposal is clearly technically unacceptable and cannot be made acceptable without a major rewrite (e.g., missing major sections), the CO must notify the offeror and eliminate the proposal from further consideration. Normally, minor omissions or deviations from format instructions do not constitute grounds for elimination.

(4) Evaluate Proposals Against Minimum Technical Requirements

The CO then forwards the technical proposals to the TEP for review. The TEP first determines whether each proposal meets the minimum requirements. This is a pass/fail (or go/no-go) process, usually conducted with checklists prepared as part of the SSP. An offeror whose proposal initially fails this evaluation may be given one or more chances to modify, clarify, or correct it. If an offeror's attempts to meet the minimum requirements are unsuccessful, the CO can determine that the offer is noncompliant, and the agency will not evaluate it further. The CO must immediately notify the offeror if this occurs and should justify and document such determinations in case of a protest.

(5) Request Clarification

The CO may communicate with an offeror to eliminate minor irregularities or apparent clerical mistakes or to request additional explanation of statements in the proposal. This request does not give an offeror freedom to revise its proposal in other ways. An example of a clarification would be a request that the offeror substantiate a proposal claim by referencing a technical publication and providing the publication to the agency. The CO should allow adequate time for offerors to prepare their responses. The SST may identify issues requiring clarification at any point during the evaluation of initial proposals, proposal revisions, or best and final offers (BAFOs).

(6) Rate Technical Proposals

The TEP evaluates proposals to determine their technical merit or value. The panel uses the factors established in the SSP and listed in Section M of the solicitation. Evaluators are usually supplied with instructions and rating sheets that identify the factors and standards to consider in each evaluation area.

The TEP must support all ratings with narrative explanations. Sometimes agencies include a technical questionnaire in the solicitation so that most, if not all, information needed for technical rating is together in one part of the proposal. For software development services, technical rating usually includes checking personnel and corporate references by telephone or correspondence. The TEP can ask the CO to request additional clarifications from offerors, if necessary, to complete its ratings.

(7) Conduct Initial Total Cost Evaluation

Total cost evaluation usually takes place at the same time as the technical evaluation. It includes not only the price or cost the Government will pay the contractor, but also other evaluated costs (e.g., costs the Government will incur that might vary among offers). The CEP performs the necessary price or cost analysis and evaluates total cost. The CEP should not communicate directly with the TEP during the evaluation. The CO should arrange for the CEP to obtain any information from the TEP that is necessary to conduct a thorough total cost analysis of offerors' proposals. The cost evaluation process should address cost reasonableness. This process includes consideration of such things as assumptions and constraints offerors make in their proposals, unbalanced pricing between the early and late years of the contract, etc.

(8) Establish the Competitive Range

The CO establishes a competitive range based on a review of the technical and total cost evaluation results, identifying offerors who have a reasonable chance of being awarded the contract. This is the first time a proposal's technical merit and associated costs are examined together. The CO has broad discretion to decide whether to place a proposal in the competitive range. When in doubt, the CO should include the proposal. As required by FAR Part 15, the CO notifies offerors not in the competitive range that their proposals are no longer eligible for award. The CO may, if stated in the solicitation, award the contract without discussion or negotiation. FAR Part 15 identifies the conditions upon which such an award can be made.
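The total cost evaluation in step (7) combines the offeror's proposed price/cost with other costs the Government expects to incur for that offer (e.g., GFE). The Python sketch below is only a minimal illustration of that arithmetic; the offerors, cost categories, and dollar figures are hypothetical examples, not values from any actual evaluation.

    # Illustrative sketch: evaluated total cost = offeror's proposed price/cost
    # plus other Government-incurred costs that vary among offers (e.g., GFE).
    # All names and figures are hypothetical.

    def evaluated_total_cost(proposed_price, government_costs):
        """Return the evaluated total cost for one offer.

        proposed_price   -- price/cost listed in the offeror's proposal
        government_costs -- dict of costs the Government would incur for this offer
        """
        return proposed_price + sum(government_costs.values())

    offers = {
        "Offeror A": evaluated_total_cost(
            1_200_000, {"GFE": 150_000, "site preparation": 40_000}),
        "Offeror B": evaluated_total_cost(
            1_150_000, {"GFE": 275_000, "site preparation": 55_000}),
    }

    for name, total in sorted(offers.items(), key=lambda item: item[1]):
        print(f"{name}: evaluated total cost ${total:,}")

Note that the offer with the lowest proposed price is not necessarily the lowest evaluated total cost once Government-incurred costs are added, which is why the instructions for total cost evaluation must define these categories in advance.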

c. Selecting a Contractor

(1) Conduct Discussions and Negotiations

For offerors in the competitive range, the CO may conduct individual oral or written discussions to gather any final information essential for determining which proposal is most advantageous to the Government. The CO may also conduct negotiations to point out deficiencies and to provide an opportunity for an offeror to revise or modify its proposal. FAR Part 15 describes the scope of discussions and negotiations. The Government must not engage in the following:

o Technical leveling -- Helping an offeror bring its proposal up to the level of other proposals
o Technical transfusion -- Disclosing technical information from offerors' proposals that allows competing offerors to improve their proposals
o Auction techniques -- Indicating a price or cost that must be met to obtain further consideration, advising an offeror of its relative price or cost standing, or furnishing information about other offerors' prices or costs

The CO and offerors should document in writing the results of their discussions and negotiations.

(2) Request BAFOs

After completing all discussions and negotiations, the CO gives all offerors in the competitive range a final opportunity to submit revised proposals. BAFOs usually focus on price or cost, although offerors may also change their technical proposals. However, the agency should not allow substantive changes (e.g., a new technical approach) constituting a new proposal. BAFOs are due at a specific place, date, and time. Because offerors may say that the availability of key personnel depends on an assumed award date, agencies should ask them to confirm as part of their BAFO that all key personnel are still available for work on the contract. An offeror may elect not to change its proposal for BAFO. If so, its original proposal (including any revisions made to it during the evaluation or discussion processes) is considered its BAFO. Calling for more than one round of BAFOs on the theory that successive rounds may reduce price is generally improper (e.g., it may constitute auctioning) and ultimately self-defeating. The CO should ordinarily conduct only one round of BAFOs.

(3) Evaluate BAFOs

Upon receipt of BAFOs, the TEP evaluates all revised portions of the technical proposals, ensuring they still meet the solicitation's mandatory requirements and revising proposal ratings when appropriate. The TEP documents the evaluation of BAFO technical changes, including the basis for any revised ratings. At the same time as the technical evaluation, the CO or CEP evaluates the total estimated cost, based on BAFO prices/costs. This includes any additional price and cost analyses necessary to evaluate BAFO prices/costs.

(4) Recommend the Apparent Winner

Next, the CO recommends the apparent winner by the method specified in the solicitation and prepares a selection package containing the justification for the recommended award and the pertinent evaluation documentation (e.g., TEP and CEP forms and reports). The package--

o Provides a summary of the results of technical and total cost evaluation processes, noting the relative differences among proposals, their strengths, weaknesses, and risks in terms of the evaluation factors; and
o Describes why the recommended offer is the most advantageous to the Government.

(5) Conduct Responsibility and Legal Reviews

After selection of the apparent winner but before award, the agency conducts both responsibility and legal reviews of the award and proposed contract.

A. Responsibility

The CO must determine that the proposed contractor is responsible. Responsibility, as defined in FAR Part 9, includes the following attributes:

o Adequate financial resources, or the ability to obtain them, to perform the contract
o Ability to meet the delivery schedule
o Satisfactory performance records on other contracts
o Satisfactory record of integrity and business ethics

The CO may enlist the help of specialists from the agency or from other agencies. For example, the CO may request a preaward survey from a designated organization within the agency in accordance with procedures established in FAR Part 9 and applicable agency regulations.

B. Legal Review of the Contract

The agency's legal counsel usually reviews the proposed contract to ensure that it meets all Government and agency legal and regulatory requirements.

(6) Select the Prospective Contractor

If the SSP provides for an SSA other than the CO, the CO usually forwards the source selection package to the SSA. The SSA reviews the package to ensure the evaluation of proposals has been fair, conducted in accordance with the solicitation and SSP, and completely documented. In theory, the SSA may choose to award the contract to an offeror other than the one recommended by the SSEB. In practice, however, this rarely occurs. Whether the SSA departs from or accepts the SSEB's recommendation, the SSA must explain in writing the basis and reasons for the decision. The documentation must include the relative differences among proposals and their strengths, weaknesses, and risks in terms of the evaluation factors. The justification for the award must be defensible to the offerors, the agency's oversight organizations, and the General Services Board of Contract Appeals (GSBCA) or General Accounting Office (GAO).

(7) Notify Unsuccessful Offerors

The FAR requires prompt notification to each unsuccessful offeror. This notice must include:

o The number of offerors solicited
o The number of proposals received
o The name and address of each offeror receiving an award
o The line item prices and total price of the winning offer
o The general reason(s) for not selecting the unsuccessful offeror

For small business set-aside acquisitions, FAR Part 15 requires that the agency notify unsuccessful offerors prior to award.

(8) Debrief Unsuccessful Offerors

When an agency awards a contract on a basis other than price alone, FAR Part 15 requires that the CO debrief any unsuccessful offeror who requests it. The request must be in writing. The agency should adequately prepare for the debriefings. Members of the SST may support the CO in the debriefing, which should cover the specific strengths, weaknesses, and risks of an offeror's proposal and a general explanation of the evaluation methodology. The CO should not permit comparisons with other offerors' proposals. The debriefing should not reveal any information not releasable under the Freedom of Information Act.

CHAPTER 11: OTHER ACQUISITION METHODS

Agencies will generally make large and complex acquisitions of software development services via contracting by negotiation, as described in Chapter 10. However, other contracting methods are available for smaller or more focused acquisitions. They are--

o Small purchase,
o Established sources of supply, and
o Sealed bidding.

11.1 ACQUISITION BY SMALL PURCHASE

Agencies can use small purchase procedures to acquire software development services from a contractor when the total contract amount does not exceed $25,000. Federal Acquisition Regulation (FAR) Part 13 and the Federal Information Resources Management Regulation (FIRMR) contain the policies for using small purchase procedures.

a. Purpose of Small Purchase Procedures

Small purchase procedures were designed for the following purposes:

o To lower the administrative cost of small dollar value acquisitions -- The acquisition process for small purchase procedures is simpler than contracting methods for larger acquisitions. It consists of receiving written or verbal price quotations and awarding the contract based on this information. Because of its simplicity, a small purchase requires less detailed planning, documentation, and oversight than contracting methods for larger acquisitions. This results in lower administrative costs.
o To improve opportunities for small businesses and small disadvantaged businesses -- Channeling purchases of $25,000 and less to small businesses increases their opportunities to receive a fair proportion of Government contracts.

An agency's need for software development services may fall within the $25,000 limitation for a small, narrowly focused effort (e.g., modifications to previously developed software). However, most requirements for software development services will exceed this dollar limitation.

b. Using Small Businesses

Small business-small purchase set-asides are acquisitions reserved exclusively for small businesses. Agencies must set aside acquisitions conducted under small purchase procedures for small businesses, except for purchases made in a foreign area or from a mandatory source of supply. The Contracting Officer (CO) conducts a survey to identify responsible small businesses that can meet the agency's needs. To facilitate this effort, the agency's contracting office usually maintains a small purchase source list to ensure that qualified small businesses receive appropriate consideration. The FAR also recommends telephone surveys or other informal means [e.g., consulting trade magazines, contacting the Small Business Administration (SBA)] to ascertain the availability of small businesses to meet the software development services requirements. The CO reviews the results of this survey to determine the number of vendors likely to respond. The CO proceeds with a small business-small purchase set-aside if there is a reasonable chance of obtaining quotations from two or more responsible small businesses. If not, the CO can proceed with the small purchase on an unrestricted basis, using vendors who are not small businesses. The CO must document the findings leading to this decision. If the agency's SBA procurement center representative disagrees with the CO's decision not to proceed with a small business-small purchase set-aside, the representative may appeal the decision in accordance with FAR procedures.

c. Competition

The degree of competition depends on the dollar amount of the purchase.

(1) Purchases of $2,500 or Less

Agencies may purchase software development services of $2,500 or less without obtaining competitive quotations, if the CO considers the price reasonable. However, the CO should distribute these purchases equally among qualified suppliers.

(2) Purchases Between $2,500 and $25,000

For purchases between $2,500 and $25,000, the CO can solicit quotations from a reasonable number of sources to promote maximum competition. Generally, quotations from three sources will suffice. The CO should obtain oral quotations unless it is not economical or practical. When obtaining written quotations, the CO should issue a Standard Form 18 for price, delivery, and other information. The CO reviews this information, determines price reasonableness, and issues a purchase order.
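The dollar thresholds above determine how much competition a small purchase requires. The short Python sketch below simply restates that decision logic; the threshold values come from the text, while the function name and messages are hypothetical illustrations.

    # Illustrative sketch: choose the small purchase competition approach based on
    # the dollar thresholds described above ($2,500 and $25,000).

    def small_purchase_competition(estimated_amount):
        if estimated_amount > 25_000:
            return "Small purchase procedures do not apply; use another contracting method."
        if estimated_amount <= 2_500:
            return ("Competitive quotations not required if the price is reasonable; "
                    "rotate purchases equally among qualified suppliers.")
        return ("Solicit quotations from a reasonable number of sources "
                "(generally three); obtain oral quotations unless impractical.")

    print(small_purchase_competition(1_800))
    print(small_purchase_competition(18_000))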

11.2 ACQUISITION FROM ESTABLISHED SOURCES OF SUPPLY

An agency office that needs software development services may be able to take advantage of contracts the agency or GSA already has in place. However, use of established sources is subject to the agency's own policies and restrictions in the contract.

a. Benefits of Using Established Sources of Supply

Making use of an existing source of supply offers several advantages over awarding a new contract:

o Faster acquisition -- Because the contract for the established source has already been awarded, the time needed to place an order for the agency's particular need is generally shorter than for contracting by negotiation and sealed bidding.

o More predictable costs -- Most established sources of supply have set rate structures (e.g., fee percentages for cost-reimbursement contracts, hourly rates for time-and-materials contracts). An agency can apply its estimate of level of effort to determine the likely total cost of the service and use this figure for budgeting.
o More predictable contractor performance -- The office or agency can contact other users for information about the contractor's technical capability, responsiveness to client needs, ability to meet schedule and cost constraints, etc.
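As a minimal illustration of applying a level-of-effort estimate to an established source's rate structure for budgeting, the Python sketch below multiplies estimated hours by labor-category rates. The labor categories, hours, and rates are invented for the example; an actual estimate would use the rates in the existing contract.

    # Illustrative sketch: budgeting estimate for a time-and-materials order
    # against an established source of supply. All categories, rates, and hours
    # are hypothetical.

    hourly_rates = {"senior analyst": 95.0, "programmer": 70.0, "technical writer": 55.0}
    estimated_hours = {"senior analyst": 400, "programmer": 1_200, "technical writer": 200}

    likely_total_cost = sum(
        hourly_rates[category] * hours for category, hours in estimated_hours.items()
    )
    print(f"Likely total cost for budgeting: ${likely_total_cost:,.2f}")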

b. Available Sources

Some agencies have established indefinite delivery contracts for software development services so their internal organizations can obtain these services quickly. GSA has also awarded contracts for software development services that any agency can use. Neither of these sources is mandatory for use Governmentwide, although an agency may have its own policies for mandatory use of its internal indefinite delivery contracts.

(1) Agencywide Indefinite Delivery Contracts

Some agencies acquire software development services frequently enough that it makes sense to put an indefinite delivery contract in place. Contact the agency's contracting office for information about these potential contracts and the procedures and requirements for using them.

(2) GSA Contracts

GSA operates the Federal Information Systems Support Program (FISSP). The program provides separate contracts in the various GSA Information Resources Management Service (IRMS) zones for a variety of Federal information processing (FIP) support services. Federal agencies may use these contracts on a cost-reimbursable basis by executing interagency agreements with GSA for their specific needs. GSA in turn issues task orders to the appropriate FISSP contractor and performs all contract and task order administration. FISSP services applicable to software development include:

o ADP systems analysis, definition, and design
o ADP software development and maintenance for business, scientific, and engineering applications

Agencies may obtain more information about the FISSP from the applicable IRMS zone listed in Appendix D. Also, the Federal Software Management Support Center provides an automated software conversion cost model that agencies can use to estimate conversion costs.

11.3 ACQUISITION BY SEALED BIDDING

Sealed bidding is a method of contracting in which the Government solicits competitive bids for requirements stated in an Invitation for Bids (IFB). The sealed bids are opened in public, and the Government awards the contract to the responsible bidder whose bid is most advantageous to the Government as determined by price and price-related factors. In contrast to contracting by negotiation, which permits discussions with offerors, sealed bidding does not allow discussions. This acquisition method generally involves less expense to the Government and the contractor than contracting by negotiation. FAR Part 14 and the FIRMR describe sealed bidding policies and procedures in detail. FAR Part 6 requires use of sealed bidding if the following conditions exist:

o The agency can publish precise specifications for the required services or products so that discussions with responding bidders will not be necessary
o A reasonable expectation exists of receiving more than one bid
o The agency can select the successful bidder on the basis of price and price-related factors alone
o Sufficient time is available to complete the sealed bidding process

Sealed bidding is seldom appropriate for software development services because these services are difficult to define precisely and because their evaluation relies in part on subjective factors (e.g., the quality predictors described in Chapter 3). The relative technical merits of two bidders' software development service approaches would not be considered in a sealed bidding award because they are not price-related.

CHAPTER 12: CONTRACT ADMINISTRATION

Administration of a contract officially begins at contract award and concludes when the agency accepts the completed software, any disputes are resolved, and the Government makes final payment to the contractor. Therefore, planning for contract administration should begin early in the acquisition. Contract administration ensures that the Government receives the software development services in accordance with the terms of the contract.

12.1 ROLES AND RESPONSIBILITIES

a. Contracting Personnel

The Contracting Officer (CO) has responsibility for ensuring that both the contractor and the Government comply with the contract terms and is ultimately responsible for all areas of contract administration, as defined in the Federal Acquisition Regulation (FAR) Part 42. However, for more complex software development services, such as the development of an agencywide, mission-critical system, one person rarely has the necessary expertise to monitor all areas of the contractor's performance. The CO, therefore, typically delegates responsibilities based on the expertise required to perform them, as shown in Figure 12-1. The delegated duties are recorded in the contract and maintained in the contract file. Agencies may designate a Procuring Contracting Officer (PCO) for solicitation development and contract award and an Administrative Contracting Officer (ACO) to administer the contract. This division of responsibilities is normally reserved for large, complex acquisitions since smaller, low-cost ones would not warrant this approach. An agency's policy and procedures will normally specify the most appropriate delegations of responsibility. In addition to the CO, most software development services contracts have at least one Contracting Officer's Technical Representative (COTR) and/or Contracting Officer's Representative (COR). The Information Resources Management (IRM)/technical office usually provides the COTR, who takes a performance-monitoring role (e.g., reviewing the contractor's design, tests, and other deliverables; answering technical questions). The COR usually comes from the contracting office (although the COR may come from either the program or IRM/technical offices) and takes an administrative role (e.g., reviewing invoices, overseeing contractor compliance with delivery/performance, maintaining contract files). Specific duties will vary from contract to contract. Figure 12-2 lists examples of CO, COTR, and COR responsibilities.

b. IRM/Technical Personnel

IRM/technical personnel help administer the contract by providing technical expertise to the CO and are delegated authority for specific technical or administrative tasks. The COTR may be from the IRM/technical office, especially for an agencywide indefinite delivery contract for software development services. The extent to which the CO delegates authority should be described in the contract and correspondence. Typical tasks may include maintaining day-to-day contact with the contractor, reviewing design and software deliverables, or inspecting the contractor's work (e.g., attending code walkthroughs, examining testing results, commenting on documentation).

c. Program Personnel

Program personnel normally assist the CO and COTR to ensure that both the development process and the software meet the requirements specified in the contract. For some software development service contracts, the CO may assign the COTR responsibilities to a program person. This delegation should be documented in the contract file.

Figure 12-1: Division of Contract Administration Tasks
Figure 12-2: Typical Contract Administration Responsibilities

12.2 PREPARING FOR THE CONTRACTOR

The agency should begin preparing early for the contractor. The contract specifies the Government furnished equipment (GFE), Government furnished information (GFI), and space the agency will provide the contractor. Therefore, early preparation ensures that the contractor will have a quality start.

a. Obtaining and Preparing Space, GFE, and GFI

(1) Space

Agencies frequently require the software development services contractor to work on-site. In this case, the Government must provide sufficient space for the contractor and bear its expense. Restricted space can lessen the effectiveness of contractor personnel. When deciding on appropriate space, the Government should consider the following issues:

o Type of work conducted by contractors (e.g., analysts may work less productively in a noisy environment).
o Number of contractors or personnel involved (e.g., software development and quality assurance contractors).
o Level of interaction between contractor and Government personnel.
o Security requirements.
o Size and complexity of equipment needed, including specific environmental requirements (e.g., climate control).
o Sufficient work space (e.g., work tables for debugging, conference rooms for team meetings).
o Physical access requirements (e.g., if the contractor frequently accesses the terminal room).

Government personnel should help contractor personnel follow the appropriate processes for obtaining required security clearances.

(2) Government Furnished Equipment

The scope of GFE varies, depending on the location of contractor personnel. Equipment provided as GFE for on-site personnel may include phones, desks, office supplies, filing cabinets, etc. At either on-site or off-site locations, the Government may provide hardware (e.g., workstations, printers), development software (e.g., programming environments and compilers), or general office automation software (e.g., word processing software). The agency may need to provide machine time, terminal access, and other support equipment, such as copiers and printers, for on-site contractors. Contractor personnel may need appropriate access identification for buildings and computer systems. The agency may consider restricting the hours of contractor access to buildings and rooms, as well as defining contractor computer security access levels. The contract will describe ownership of the GFE, particularly for equipment provided to the contractor for use at its own site. Both Government and contractor personnel should be familiar with these terms.

(3) Government Furnished Information

GFI consists of documentation to assist the contractor in providing software development services. These include regulations, manuals, directives, standards, and operating instructions identified in the basic contract. Other types include information relevant to the content of the project (e.g., previous design documents, earlier studies), contract-specific procedures that the contractor must follow, and general information (e.g., agency organization charts, phone lists of assigned personnel). Contractors frequently are not required to return GFI at the end of the contract. If a contractor was provided any sensitive information, an agency should follow its own security requirements to ensure it remains in a controlled environment. For example, the agency could distribute numbered copies and require the return of all copies at the end of the contract or provide access only within a secured area.

b. Preparing Agency Personnel to Work with the Contractor

In addition to preparing work space and an appropriate environment for the contractor, the agency should prepare its personnel to work with the contractor. Agency personnel must be familiar with the contract scope and requirements, as well as appropriate activities for both agency and contractor personnel. Agency personnel should fully understand the scope of services provided for in the contract. They must know what can and cannot be done within the terms of the contract (e.g., not asking contractor personnel to perform personal services) and the activities they are responsible for completing, especially when a contractor's successful completion of software development depends on Government actions (e.g., providing comments on deliverables within a given number of days). Agency personnel should know their own roles and the contractor's roles, the lines of authority (e.g., who is authorized to direct the contractor), and lines of communication within both the Government and the contractor's organization. They should also know that the COTR is responsible for technical concerns, the CO is responsible for contractual concerns, and only the CO is authorized to make changes to the contract. If an unauthorized person provides incorrect direction to the contractor, that person may be liable for the consequences. Agency personnel should know to whom they should report problems with the contract.

c. Reviewing the Contractor's Project Plan

The contractor's project plan is often an early deliverable or incorporated in the proposal. Agency personnel with prior experience in similar projects should examine the project plan to identify possible concerns and issues and, if necessary, refine it. The plan should accurately represent the project. The project plan should identify both contractor activities and the activities for which the contractor requires Government resources. Areas to consider when reviewing the contractor's project plan include:

o Scheduling -- Are the established time frames realistic? Can the contractor personnel realistically provide the services within the time stated? How quickly must the Government react to deliverables? Can the Government meet this schedule? To what extent would delays disrupt the contractor's schedule?

o Milestones -- Are sufficient points in the schedule identified to allow the Government to effectively review progress, control the development process, and manage the quality of the deliverables?
o Assumptions and constraints -- Are the assumptions and constraints still valid? Are they reasonable?
o Staffing -- Do the assigned levels of staffing match the Government's estimate? How do they compare to the contractor's original proposal? Is the required mix of skills and experience available? Are required key personnel provided?
o Compliance with contract requirements -- Does the plan reflect requirements?

d. Developing a Tracking and Reporting System

The Government should develop a tracking and reporting system to ensure that the contract's requirements are met in terms of quality, timeliness, quantity, etc. The agency could establish a manual system; however, automated tools exist that help track and report project activities. Several project management software and scheduling software tools are available commercially. Although the contractor must comply fully with the contract, some activities may be particularly critical to the project's success or particularly risky. Also, delay in activities on the critical path can delay the entire project. A tracking and reporting system should highlight these critical activities and risks, facilitate early identification of problems, and measure the contractor's progress. The agency should provide the contractor a list of the requirements included in the tracking and reporting system. The contract may require that the contractor aid the Government in tracking and reporting project activities or put its own tracking system in place and provide the agency with reports on both a regular schedule and an ad hoc basis. Regularly scheduled reports ensure progress continues as expected and identify problems as they arise. Ad hoc reports ensure all anticipated problems come to the attention of the CO and COTR promptly. For cost-reimbursement and time-and-materials (T&M) contracts, the tracking and reporting system should facilitate verification of the contractor's costs.
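One minimal sketch of such a tracking aid appears below; it is a hypothetical illustration, not a prescribed system. It records deliverable due dates, flags critical-path items, and reports anything overdue as of a status date. The deliverable names and dates are invented for the example.

    # Illustrative sketch of a simple deliverable tracking and reporting aid.
    # Deliverables, dates, and critical-path flags are hypothetical; an agency
    # system would reflect the actual contract schedule.

    from datetime import date

    deliverables = [
        {"name": "System design document", "due": date(1995, 3, 1),
         "critical": True,  "received": date(1995, 2, 27)},
        {"name": "Program specifications", "due": date(1995, 5, 15),
         "critical": True,  "received": None},
        {"name": "Draft user manual",      "due": date(1995, 6, 30),
         "critical": False, "received": None},
    ]

    def status_report(items, as_of):
        """Print one status line per deliverable, highlighting critical-path items."""
        for item in items:
            if item["received"] is not None:
                state = f"received {item['received']}"
            elif item["due"] < as_of:
                state = f"OVERDUE (due {item['due']})"
            else:
                state = f"due {item['due']}"
            flag = "CRITICAL PATH" if item["critical"] else ""
            print(f"{item['name']:<26} {state:<26} {flag}")

    status_report(deliverables, as_of=date(1995, 5, 20))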

12.3 POST-AWARD CONFERENCE

The relationship between an agency and its contractor is extremely important to the success of a contract. A project initiation meeting or post-award conference helps establish that relationship. It should result in a mutual understanding of all administrative and performance requirements. The meeting provides an opportunity to refine minor points, to introduce personnel and describe their roles in the project, and to establish an effective working relationship between the Government and the contractor. It should not become a forum for lengthy discussions concerning software development services requirements. The CO, together with the COTR and/or Program Manager (PM), determines the need for a post-award conference if the Statement of Work (SOW) does not specify one. Figure 12-3 shows sample agenda items.

12.4 CONTRACT MONITORING

Contract monitoring begins immediately after the award and ends when disputes, if any, are resolved and the contract is closed out. Its main purpose is to verify that a contractor's performance fulfills the contract requirements in terms of--

o Work conforming to the specifications,
o Satisfactory progress,
o Quality standards,
o Delivery or performance on time, and
o Costs within the estimated range.

Routine contract monitoring can include communicating with the contractor, obtaining performance, determining awards and incentives, and closing out the contract. More complex tasks occur with changes in the contract terms or termination. These tasks include contract modifications, disputes, and contract terminations.

Figure 12-3: Sample Agenda Items

a. Communicating with the Contractor

The contract should describe arrangements for all contract-related communications between the contractor and agency personnel. Written status reports and periodic status briefings for the agency can be very useful and should be specified in the contract.

They have particular importance in software development services contracts in which performance is complex and difficult to assess and significant time may elapse between deliverables (e.g., system design, program specifications). The timing, content, and level of detail needed should be determined in advance and should correspond to the size, scope, complexity, and criticality of the contract. The contract should also define communication lines with the contractor, such as the COTR addressing technical performance and the CO addressing questions about contract terms or conditions. In addition, both the agency and contractor should clearly define the authority of each staff member. The CO approves the limits of authority for agency personnel, and the contract specifies the limits of authority for contractor personnel. Throughout software development services contract administration, agency representatives must not exceed their approved level of delegated authority. A fine line exists between technical instructions and contractual discussions/changes. For example, if a COTR instructs the contractor to perform extra work (e.g., add additional software capabilities), the contractor may make a claim for additional payment. The CO should ensure that the contractor and the COTR completely understand the scope of the contract and ensure that the assigned work stays within it.

b. Enforcing Key Personnel Clauses

The usefulness of key personnel clauses depends on their enforcement by the CO. The CO, with the assistance of the COTR, must monitor compliance. If the contract includes a clause allowing no substitutions for a fixed period, the COTR should carefully compare the contractor personnel working on the development project to the list of key personnel proposed. The CO should also enforce lead times for giving notice of proposed substitutions and ensure thorough review of resumes for persons proposed as substitutes. When evaluating substitute personnel, the agency should look beyond the resume. It should ask for and check references. The agency should also consider that although the two individuals may be comparable on paper, the new person may not have the same understanding of and experience with the agency's project and, thus, must spend time learning it, with the risks and inefficiencies this implies.

c. Obtaining Performance

The contract is the measurement device used to determine whether contractor performance meets technical, schedule, and cost criteria. It is built on the solicitation (including amendments) and the contractor's final proposal. If conflicts arise in interpreting requirements for contract performance, those stated in the solicitation normally override the contractor's proposal. Exceptions include contractor assumptions accepted by the Government before contract award. In addition, certain sections of the Uniform Contract Format (UCF) take precedence over others (e.g., Section F overrides Section C). The Order of Precedence clauses required by FAR Parts 14 and 15 identify the order of precedence for contract sections.

(1) Technical Performance

Generally, the CO or COTR determines whether the contractor is meeting the contract's technical requirements in terms of quality, timeliness, quantity, etc., by applying a tracking and reporting system. For a software development services contract, this consists of keeping complete and accurate records of reports, audits, walkthroughs, reviews, test results, and deliverables. The use and scope of this system is discussed in Section 12.2.

(2) Cost Performance

The CO, usually with assistance from the COTR or a COR, monitors contract costs. Firm-fixed-price (FFP) contracts typically are easier to monitor than others. Under FFP contracts, the contractor has an incentive to perform economically because every dollar of cost not incurred under the contract price becomes additional profit. Under FFP contracts, the agency does not have to track the contractor's costs. However, under certain other types of contracts, such as cost-reimbursement, the contractor has less incentive to keep costs low since payment is based on costs incurred or hours spent doing the work. Therefore, the CO and COTR/COR must track the contractor's costs to determine their allowability and reasonableness. However, in the case of large, complex software development services contracts in which functionally-stated requirements may be subject to interpretation and few guidelines are available to determine reasonableness, it could be difficult to monitor costs and determine reasonableness. If the CO or COTR/COR challenges a specific cost, the contractor must prove that it is reasonable. If contract changes occur, the agency may have to adjust contract price/cost limitations.

Determining equitable payments, satisfactory to both parties, has historically been a problem. In a cost-reimbursement contract without well-defined authorized reimbursable costs, problems concerning appropriate compensation may arise. No precise formula exists for calculating the exact amount of an equitable adjustment in every situation. An analysis of previous charges by the contractor and market sources should be used to determine a fair price.

(3) Schedule Performance

The COTR should know if the contractor is performing on schedule and if its progress monitoring system is adequate. The COTR should also know if the agency is meeting its own obligations (e.g., providing comments on deliverables within a given number of days) so that responsibility for delays can be attributed to the appropriate party. The effort needed to monitor schedule performance depends on the size and nature of the software development services contract requirements. Software development services contracts normally require that the contractor submit regular reports comparing actual to scheduled progress. If adjustments are made in the schedule and approved by the CO, the contractor should reflect them in its reports. If a contractor repeatedly fails to perform on schedule, the agency may terminate the contract for default. However, the CO should find out the reason for the delays before moving to default procedures. If the delays are excusable, the CO should give the contractor time to correct its problems. To get back on schedule, the contractor may try using overtime and/or multiple shifts. For incentive or cost-reimbursement contracts, the CO or COR should pay close attention to overtime or premium pay.
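The regular reports described above compare actual to scheduled progress. The Python sketch below is a hypothetical illustration of one way to compute schedule slippage per milestone so delays can be attributed and tracked; the milestones and dates are invented for the example.

    # Illustrative sketch: compare actual (or forecast) milestone dates to the
    # approved schedule and report slippage in days. All milestones and dates
    # are hypothetical.

    from datetime import date

    milestones = [
        ("Critical design review", date(1995, 4, 10), date(1995, 4, 10)),
        ("Code complete",          date(1995, 7, 1),  date(1995, 7, 22)),
        ("System test start",      date(1995, 8, 15), date(1995, 9, 5)),
    ]

    for name, scheduled, actual in milestones:
        slip_days = (actual - scheduled).days
        status = "on schedule" if slip_days <= 0 else f"{slip_days} days late"
        print(f"{name:<24} scheduled {scheduled}  actual {actual}  -> {status}")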

d. Determining Awards and Incentives

Contracts containing award and/or incentive provisions provide the contractor with motivation to improve performance (e.g., cost, schedule, management, quality, expertise). The CO has responsibility for determining if the contractor has met the requirements necessary to receive the award or incentive. Award fee provisions (e.g., Cost-Plus-Award-Fee) require the CO's judgmental evaluation of the contractor's performance and are not subject to appeal. Incentive fee provisions (e.g., Cost-Plus-Incentive-Fee) require objective evaluation of cost, performance, or delivery data and are subject to appeal. Chapter 9 describes the available award and incentive provisions. For Cost-Plus-Award-Fee contracts, the Government pays allowable costs, base fee, and award fee. The contractor's base fee does not vary with performance, but the award fee is based on the CO's evaluation of performance. For Cost-Plus-Incentive-Fee contracts, the Government pays allowable costs and an incentive fee, which is determined by comparing actual costs to target costs. A fee adjustment formula (share ratio) is applied to the difference. For Fixed-Price-Incentive contracts, the Government pays an incentive fee equal to the sum of the final negotiated cost and estimated final profit. If included in the contract, the agency also considers performance and delivery targets in determining the incentive fee. For software development services, the fees become more difficult to determine as contract requirements become more complex and less clearly defined. For example, a large, complex software development services acquisition (e.g., an agencywide, mission-critical financial system) will require maintaining a significant amount of documentation to support a fee determination. The CO should ensure that the agency sets up mechanisms in advance to collect and organize the information necessary to support the award or incentive determination and that this information is collected at the appropriate time.
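To make the Cost-Plus-Incentive-Fee arithmetic concrete, the sketch below applies a share ratio to the difference between target and actual cost and holds the result within minimum and maximum fee limits. The target cost, target fee, share ratio, and fee limits are hypothetical values chosen for illustration; an actual contract states its own figures.

    # Illustrative Cost-Plus-Incentive-Fee (CPIF) sketch. All dollar figures and
    # the share ratio are hypothetical.

    def cpif_fee(actual_cost, target_cost, target_fee,
                 contractor_share, min_fee, max_fee):
        """Adjust the target fee by the contractor's share of any cost underrun
        or overrun, then hold the result within the negotiated fee limits."""
        underrun = target_cost - actual_cost           # positive if costs came in low
        fee = target_fee + contractor_share * underrun
        return max(min_fee, min(max_fee, fee))

    # Example: $1,000,000 target cost, $80,000 target fee, 80/20 share ratio
    # (contractor keeps 20 cents of each dollar saved or overrun), fee held
    # between $40,000 and $120,000.
    for actual in (900_000, 1_000_000, 1_150_000):
        fee = cpif_fee(actual, 1_000_000, 80_000, 0.20, 40_000, 120_000)
        print(f"Actual cost ${actual:,}: incentive fee ${fee:,.0f}; "
              f"Government pays ${actual + fee:,.0f}")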

e. Exercising Contract Options

A contract option gives the Government the unilateral right, at a specific time and price, to acquire additional software development services or to extend the contract's period of performance. The Government will exercise a contract option only if the following conditions are met:

o Funds are available
o The service covered by the option meets an existing Government need
o Exercising the option is the most advantageous method of fulfilling the Government's need, price and other factors considered

The contract specifies the time frame in which the Government can exercise options. The CO exercises contract options in writing and includes these documents in the contract file. The Government is not penalized for choosing not to exercise an option.

f. Contract Modifications

Modifications are authorized changes made to a contract after award. A contract should have some flexibility to allow response to an agency's changing requirements. For example, while analyzing requirements for a new payroll system, the contractor may discover the need for an automated interface to the agency's central accounting system that requires substantial additional work to develop. Awarding a new contract to accommodate this requirement could delay the entire project. In this case, the existing contract might be modified to cover the additional effort. The Government's authority to make modifications in any contract is defined in its clauses or provisions. Agencies also need to ensure that the delegation of procurement authority for the overall acquisition covers the scope, dollar amount, and extent of competition of the modification. A contract modification may be either unilateral or bilateral. Only the CO signs unilateral modifications, while both the contractor and the CO sign bilateral modifications (supplemental agreements). Two potential modifications to a software development services contract are:

(1) Administrative Changes

An administrative change is a unilateral contract change that does not affect the contractual rights of the parties (e.g., a change in the paying office).

(2) Requirements Changes

A change in contract requirements within the contract scope may result from changes in Government needs. Such changes usually result in modification of the Government's cost obligations or the performance schedule. Adding, changing, or deleting a service specified in the contract may require changing the contract.

g. Warranties

A software development services contract may specify that the contractor provide warranties. A warranty tends to foster quality performance. It establishes the rights and obligations of the contractor and the Government in the event the software is defective. A warranty also usually provides for the correction of defects. The Government may retain a right for the correction of defects for a stated period of time after acceptance of the software, or the contractor may warrant that the software will be free from programming defects for a specified period after it has been installed. If defects are discovered during this period, the contractor will generally repair them or install a modified version of the program for no additional charge. The warranty period can often be extended for an annual charge. The use of warranties depends on the nature of the development services and the use of the software. If the contractor agrees to provide warranties, the contract will define the warranty period. The agency should have an adequate administrative system for reporting software defects. The adequacy of a reporting system depends upon the following factors:

o Nature and complexity of the software
o Location and proposed use of the software
o Difficulty in establishing existence of defects
o Difficulty in tracing responsibility for the defects

Government personnel responsible for monitoring performance of software under warranty should be familiar with the following:

o Nature of the software and characteristics warranted
o Extent of the contractor's warranty, including its obligations for breach of warranty
o Remedies available to the Government
o Scope and duration of the warranty

h. Quality Assurance (QA) Deduction Schedules

Software development service contracts may have QA deduction schedules to discourage unsatisfactory performance. These schedules must be based on measurement against predetermined performance and surveillance plans.
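As a minimal, hypothetical illustration of how a QA deduction schedule operates, the sketch below applies a predetermined deduction rate to an invoice when measured performance falls below the standard in the surveillance plan. The performance thresholds and deduction percentages are invented for the example; a real schedule is fixed in the contract.

    # Illustrative QA deduction sketch. Thresholds, rates, and the invoice amount
    # are hypothetical; the actual schedule comes from the contract's
    # performance and surveillance plans.

    deduction_schedule = [          # (minimum acceptable score, deduction rate)
        (0.95, 0.00),               # meets standard: no deduction
        (0.90, 0.05),               # marginal: 5 percent deduction
        (0.00, 0.10),               # unsatisfactory: 10 percent deduction
    ]

    def qa_deduction(invoice_amount, measured_score):
        for threshold, rate in deduction_schedule:
            if measured_score >= threshold:
                return invoice_amount * rate
        return 0.0

    print(f"Deduction: ${qa_deduction(50_000, 0.92):,.2f}")   # marginal performance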

i. Claims

A claim is a written demand by one of the contracting parties seeking, as a matter of right, the payment of a sum of money, the adjustment or interpretation of contract terms, or other relief. The Contract Disputes Act of 1978 (41 USC 601-613) establishes procedures and requirements for asserting and resolving claims by or against contractors. The Act also provides for the payment of interest on valid contractor claims and for a civil penalty if claims are fraudulent or based on a misrepresentation of fact. The Government's policy is to try to resolve all contractual issues by mutual agreement at the CO's level without litigation. If a claim cannot be settled that way, the CO prepares a written decision that documents the facts pertinent to the claim, the areas of agreement and disagreement, and the CO's decision with supporting rationale.

The contractor can then appeal the CO's decision to either the appropriate Board of Contract Appeals within 90 days or the U.S. Claims Court within 12 months.
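To illustrate the two appeal windows, the sketch below computes both deadlines from a hypothetical CO decision date; the 90-day and 12-month periods come from the text above, and the date is an invented example.

    # Illustrative sketch: appeal deadlines following a CO's final decision.
    # The decision date is hypothetical.

    from datetime import date, timedelta

    decision_date = date(1995, 3, 15)
    board_deadline = decision_date + timedelta(days=90)
    court_deadline = decision_date.replace(year=decision_date.year + 1)

    print(f"Board of Contract Appeals deadline: {board_deadline}")
    print(f"U.S. Claims Court deadline:         {court_deadline}")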

j. Contract Terminations

The two types of contract terminations are (1) termination for the convenience of the Government and (2) termination for default. The CO makes these decisions. FAR Part 49, Termination of Contracts, sets forth the policy and procedural guidelines.

(1) Reasons for Termination

Under the Termination for Convenience clause, the Government has the unilateral right to cancel work under a contract, in whole or in part, whenever it determines that such action is in its best interest. This generally occurs when the Government no longer needs the work. When exercising this clause, the Government agrees to pay the contractor's costs of termination, plus a reasonable profit for the work done and the preparations made to perform the terminated portion of the contract. However, when the undelivered balance of the contract is less than $5,000, the contract should not normally be terminated for convenience, but permitted to run to completion. Under the Termination for Default clause, the Government has the right to completely or partially terminate a contract because of the contractor's actual or anticipated failure to perform contractual obligations. The Government invokes termination for default when the contractor fails to meet the technical requirements of the specifications or the delivery or performance schedule, or breaches other conditions of the contract. When exercising this clause, the Government is entitled to relief for damages suffered as a result of the contractor's failure to perform. The contractor must pay additional costs incurred by the Government in reprocuring the terminated work, including any losses because of the delayed completion of the work. The Government may take over the work and complete it by awarding another contract, by using Government personnel, or by other means.

(2) Procedures for Termination

A. Termination for Convenience

The CO must give written notice to the contractor stating the contract clause authorizing the termination, the effective date, the extent of termination, any special instructions, and how the contractor may minimize the impact on its personnel if the termination results in a significant reduction of the work force. Upon receipt of the notice, the contractor must comply with the termination clause and the terms of the notice, then submit a proposal for settlement. The CO directs the actions of the contractor, reviews the contractor's settlement proposal, and promptly negotiates a settlement. When possible, all terminations should be settled by negotiation. FAR Part 49 provides specific procedures for calculating a settlement.

B. Termination for Default

The specific procedures for a termination for default can vary based on the contract terms and the circumstances of the default. The CO should review these prior to initiating action. Usually, the CO must advise the contractor, in writing, of the reason(s) for a default termination and allow the contractor at least 10 days to correct or cure its failure. This notice is often referred to as the 10-day cure notice. If the time remaining in the contract is less than 10 days, the CO will usually issue a show cause notice. This notice gives the contractor a chance to show that the failure to perform resulted from causes beyond its control. FAR Part 49 provides procedural guidelines for preparing both notices. If the contractor fails to cure the situation or to provide a satisfactory explanation for the delay within the 10 days, the Government may take termination action. If it does, the CO will issue a Notice of Termination for Default. Upon receipt of the notice, the contractor must stop work immediately on the terminated portion (all or part) of the contract.
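As a rough illustration of the convenience termination decision and settlement described above, the sketch below applies the $5,000 undelivered-balance rule and totals the costs of the work done and of terminating, plus a reasonable profit on the work done. The dollar figures and profit rate are hypothetical, and FAR Part 49 governs the actual settlement calculation.

    # Illustrative sketch only. The $5,000 undelivered-balance rule is from the
    # text; cost figures and profit rate are hypothetical. FAR Part 49 prescribes
    # the actual settlement procedures.

    def convenience_settlement(costs_of_work_done, termination_costs, profit_rate):
        """Simplified settlement: costs of the work performed and of terminating,
        plus a reasonable profit on the work done."""
        return costs_of_work_done + termination_costs + profit_rate * costs_of_work_done

    undelivered_balance = 250_000
    if undelivered_balance < 5_000:
        print("Normally permit the contract to run to completion rather than terminate.")
    else:
        estimate = convenience_settlement(180_000, 12_000, 0.08)
        print(f"Estimated convenience termination settlement: ${estimate:,.2f}")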

k. Contract Closeout

Contract closeout includes all final contract activities (e.g., ensuring completion of all requirements, making the final payment). Under normal circumstances, to close out a contract, two conditions must exist:

o The Government must have received, reviewed (or inspected, if applicable), and accepted all required services and deliverables in a manner that satisfies the needs of the Government, as specified in the contract.
o After the Government is satisfied, it makes a final payment to the contractor.

If the contractor accepts final payment as such, indicating no further claims, and the Government completes its final contract audit, the contract can be closed out. Final payment is important not only because it completes the obligation of the Government, but also because it cuts off certain claim activities by the contractor. After final payment with no exceptions or qualifications, the contractor gives up its rights to bring any claims against the Government. The CO should make an effort to settle any contractor claims before final payment.

Page 87: A GUIDE FOR ACQUIRING SOFTWARE DEVELOPMENT SERVICES

before final payment. In addition, the CO should proceed carefully if the contractor submits a claimafter what appears to have been the final payment because the CO's actions might be interpreted as anadmission that "final payment" was not intended to be final.

l. Contract Reviews and Audits
Contract reviews and audits provide a means of maintaining control, ensuring that contracts are properly monitored and that the contractor is providing high quality software development services. Reviews and audits may vary widely in scope and complexity, ranging from tightly focused (e.g., travel voucher reviews) to comprehensive, and may cover not only financial performance, but also compliance with contractual terms and conditions. Successful audits generally depend on a combination of precisely defined audit objectives, rigorous methodology, and proper execution by qualified personnel. Understanding the nature and scope of work of the contract and its procedural requirements is essential to a successful audit. However, agencies should ensure that the audit process uses agency resources efficiently and does not unduly interfere with work on the contract. In addition, agencies should avoid techniques that unduly burden the contractor (e.g., stopping payments while conducting the audit). Reviews are similar to audits in that they also provide a means for routine examination of the contractor's progress. However, reviews typically occur more frequently than audits and are less structured and extensive. Audits should be conducted in accordance with Federal regulations and auditing standards, including Governmentwide policy; guidance from the Federal Acquisition Regulation, the Federal Information Resources Management Regulation, and the Federal Travel Regulation; guidance from other oversight and guidance agencies (i.e., General Services Administration, Office of Management and Budget, National Institute of Standards and Technology, and Defense Contract Audit Agency); and internal agency procedures. The Comptroller General publishes Government Auditing Standards for audits of Government organizations, programs, activities, and functions, and of Government funds paid to contractors, nonprofit organizations, and other nongovernmental organizations. The American Institute of Certified Public Accountants publishes an Audit and Accounting Guide for Audits of Federal Government Contractors.

m. Importance of the COTR in Software Contract Monitoring
The role of the COTR is essential in monitoring a software development services contract. In addition to acting as a liaison between the Government and the contractor and between the user and the CO, the COTR also determines whether the contract deliverables meet functional, technical, and performance specifications and recommends their acceptance or rejection to the CO. The quality of the delivered software frequently reflects the quality of the COTR's involvement. The COTR provides progress reports to and stays in close communication with the CO, relaying any information that may affect contractual commitments and requirements. The COTR's failure to communicate effectively could result in the Government having to waive certain legal and contractual rights. The COTR should immediately alert the CO if delinquencies, unsatisfactory performance, or apparent changes in the contract occur so that corrections can be made before the problems escalate. A primary purpose of contract monitoring is to minimize disruptions caused by unsatisfactory contractor performance. The COTR is also responsible for communicating with the contractor. The COTR monitors deliverables and performance under the terms of the contract. The COTR cannot change or waive a delivery date for any reason and should notify the CO of delivery or nondelivery. The COTR monitors all contract performance, not just tasks that can be tracked by a completion date. Finally, the COTR may not modify the intent or the terms and conditions of a task order or contract. While the COTR is the technical advisor to the CO for all aspects of contract administration, the COTR must request changes to the contract through the CO.

CHAPTER 13: SOFTWARE DEVELOPMENT
This chapter provides an overview of the stages of the software development life-cycle that start after contract award. This life-cycle is separate from the acquisition life-cycle, but the two overlap. The software development life-cycle starts with the initial identification of a need and ends with the final disposition of the software. The software development activities in this chapter describe the classic "waterfall" or sequential approach, which is the basis for a number of methodologies. Methodologies may differ in the specific activities conducted at various stages and in the activity sequence. Many agencies have an agency-specific life-cycle methodology that the contractor must use.

13.1 FUNCTIONAL AND DATA REQUIREMENTS ANALYSIS
The agency establishes high-level software requirements in the requirements analysis, described in Chapter 5, and refines them for the solicitation. In the first stage after contract award, the contractor expands these requirements into detailed functional and data requirements (FDR). The FDR provides a technologically independent, detailed description of the programs, data required, and other activities that the software will support. The detailed requirements form the basis for the software's detailed design and permit a full assessment of its benefits and costs. As a result of the FDR, the contractor prepares detailed documentation of the requirements and capabilities of the software to guide its design. Some development methods call this document the Functional Description Document or the logical design.

a. Determine Detailed Functional Requirements
Detailed functional requirements encompass all aspects of the software: processing features; software performance; interfaces with existing or developmental systems; security; backup/contingency needs; and implementation, operational, and maintenance support. An agency typically distinguishes between critical and non-critical requirements so that it can more easily decide on tradeoffs between requirements that are mutually exclusive or conflicting. Detailed functional requirements may include the following:
o Description of each major function supported by the proposed software -- This description includes information flows; logical content of reports and screen displays, with their content, use, volume, and frequency; interfaces to other systems (e.g., processes, data, hardware, communications); manual procedures; and security.
o Software performance and environment requirements -- Performance and environment requirements include data storage (volume), response time, user interface, software flexibility (e.g., adaptability), backup, security, and failure contingencies.
o Design and development considerations -- These considerations include anticipated software life span, organizational impacts of implementing the new software (e.g., workflow, staffing levels), physical location of users, conversion/consolidation of existing systems, transition from existing to new software (e.g., training, parallel operation), new or existing hardware, and facilities required.
o Potentially excluded functional requirements -- These are functional requirements potentially excluded from the new software, and the identification of current or future projects to accommodate them.
Additional issues may arise that affect requirements. These include uncertainties of program direction, changes in program policy or operation, technical limitations, and dependencies on other systems both inside and outside the agency. Whenever possible, the agency should document these issues and how they were addressed.

b. Determine Detailed Data Requirements
Detailed data requirements also expand on the requirements identified during the requirements analysis and in the solicitation. These detailed requirements describe the data the software will generate and maintain and their logical structure and relationships, and determine both the current and the new data the software will use. During this analysis, the agency and the contractor will also identify, agree on, and document potential data requirements that fall outside the scope of the planned software. Analysis of data requirements includes considering the use of existing data element definitions, logical database structures, sources for all data, and the organizational impacts of collecting and entering data into the software. Detailed data requirements may include the following:
o Logical data model -- The logical data view focuses on information needs, and the logical data model determines the data needed to support the software without considering physical limitations (e.g., a buffer size that would limit physical record length). The model includes definitions of the data entities and data elements. Data entities are persons, places, things, and events about which the agency needs information. Data elements are unique properties of entities. For instance, "employee" is a data entity and "employee number" is a data element. A logical data model also describes the relationships between different data entities. (A brief illustrative sketch follows this list.)
o Current system data -- Current system data include data elements, data structures, database management software, documentation, and conversion requirements, if any.
o Workload requirements -- Workload requirements describe the movement of data, including when, where, how much, how frequently, and how fast the data must move.
o Requirements outside of scope -- These include requirements too complex, time consuming, risky, and/or costly to fulfill within the scope of the currently defined software.
o Other requirements for data -- These include quality (e.g., completeness and accuracy), security, archive, retention, and audit trail requirements for all data.
During the development of the FDR, the agency and/or contractor may identify unanticipated functional requirements that can modify data requirements. To the extent possible, these should be documented in the FDR.

13.2 DESIGN
The design stage transforms the FDR into complete, detailed specifications for the software. These describe how the software will meet the requirements. At the detailed level, this description includes physical databases; processing; external system interfaces; hardware, software, and communications environments; manual procedures; data conversion; and user support. The FDR provides the framework for the software design, which in turn guides the work of the build stage. In addition, the design includes consideration of the eventual testing of the software. Different methodologies dictate increasingly detailed levels of design. This Guide describes two levels: preliminary and detailed. A preliminary, general design determines how to build the software, and the detailed design provides all the technical detail.

a. Preliminary Design
The preliminary design determines, in general, how to build the software. The contractor identifies individual physical components at a "black box" level (i.e., without showing the software's internal logic). Some of the most significant activities of this stage include:
o Identify technical environment -- This activity describes the platform for the components (e.g., Federal information processing (FIP) equipment, data communications, and system software) on which the new software will run, to the extent these are known. For example, the agency may know a database management system will be used, but not which one. Additional detail may be added later in the design stage.
o Identify software components -- These components include preliminary database structures, inputs, outputs, internal processing, and system interfaces. The contractor expands the logical data model to include additional control and edit data the software will need. The contractor should also describe any plans for reusing software components and identify which ones to reuse.
o Identify manual procedures -- These include a high-level description of non-automated procedures.
o Design conversion processing -- This describes alternative procedures for converting data from existing software/databases to the new ones.
o Document new requirements -- These include any new requirements identified during this stage. If the agency's original need for software has changed, the contractor may have to modify the requirements.
o Determine operational changes -- These include operations affected by the new software, and the modifications needed to take full advantage of the software (e.g., timing/frequency of activities, location where performed, revised workflow).

b. Detailed Design
The detailed design stage identifies the technical detail. This information should provide enough detail to support the build stage. Significant activities conducted during this stage include:
o Select commercial software packages -- The contractor selects, with agency approval, commercial software packages (e.g., database management systems) to perform software functions that are available commercially and that fulfill the requirements most cost-effectively. The contractor may need to modify some commercial software or develop custom interfaces. If the contractor plans to modify the commercial software, the agency should ensure that either the commercial software provider or the contractor agrees to support the modified product.
o Design software -- During this activity, the contractor adds levels of greater detail to the physical model, which is the "blueprint" for building the software. While the logical data model represents the detailed information needs, the physical model includes both the information needs and the technical environment (i.e., hardware, database management system) in which the software will run. To build the software, the contractor must develop detailed specifications describing:
oo Internal processing
oo Reports, screens, and other data access capabilities, including user selection procedures, content, media, and format (i.e., graphics and tabular reports)
oo Interface(s) with existing and planned systems that will share data or resources with the new software
oo New modules, revised processing logic or data structures
oo Hardware resource allocation, communications access, security
oo Physical database structures and how to implement them in the test versions of the software
oo Automated backup and disaster recovery procedures and mechanisms
o Design and document manual procedures -- These provide detailed descriptions of the non-automated procedures for using the software and how users will perform them, including backup and disaster recovery.
o Document technical environment -- The technical environment consists of FIP equipment, data communications, and system software (e.g., database management software, operating system software) on which the new software will operate. The document describes their relationships and physical distribution.
o Design and document conversion processes -- These explain how existing software/databases will be converted to function with or be used by the new software. The detailed design also describes procedures for migrating (i.e., mapping) records from the old software into the new; performing data edit, validation, and database initialization; and handling errors.
o Plan software testing -- During the design stage, the contractor should plan how to test the software. Usually the agency will require the contractor to provide a test plan as a design stage deliverable. If the contractor is building an application that has already been designed, the agency may require a test plan as part of the proposal.

c. Design Review
Large software development efforts may entail a very long and complex build stage. Interim reviews will help ensure that the development contractor addresses all requirements and follows the approved design. These reviews begin with the design itself and continue throughout the development process. The number, content, and timing of the interim reviews will vary, depending on the software's scope, functionality, and complexity, and on the methods and tools used to develop it. The following design stage activities focus on reviewing the software design and occur periodically:
o Conduct design walkthrough(s) -- Walkthroughs ensure that a design is programmatically and technically complete. During a walkthrough, the agency's technical staff conducts one or more structured peer reviews of the software design to ensure that its details are complete, meet the requirements for the software, and can be fully implemented.
o Confirm Work Breakdown Structure (WBS) for future phases -- The contractor provides and the agency confirms a detailed WBS for the development and implementation stages. This includes the detailed activities in each stage, their timing, and the linkages among them.
o Document new requirements -- The contractor documents any new requirements identified during the design stage.
o Confirm operational changes -- The contractor confirms that the operational changes identified earlier in the design stage are still valid or identifies other modifications.
o Confirm interaction with current systems -- Confirm the current and developmental systems and databases that will interface or share resources with the new software and determine the specific modifications needed to accommodate it. These could include new modules, revised processing logic or data structures, new data elements, hardware resource allocation, communications access, and security. Confirm the existing data that the new software will use and how they are stored and structured.
o Prepare software and acceptance test documents -- Define and document software and acceptance test procedures and coordinate preparation of test data.

13.3 BUILD
In the build stage, the contractor develops the new software based on the approved detailed design. Although much of the activity in this stage addresses creating the software, it also puts in place the hardware, software, and communications environment, and produces the manual procedures and other elements of the overall system. The contractor and the agency's quality assurance team conduct most of the build activities. Other organizations participate less intensively than during earlier stages. Some of the most significant activities of this stage include:
o Confirm availability of the development environment resources -- This activity ensures the availability of hardware, data communications, system software, and applications software packages needed to conduct the development of the new software. If these are not already available, the agency arranges for their acquisition and installation.
o Develop the software -- This activity creates the software in accordance with the design. It includes writing or generating new code and converting or customizing other software.
o Prepare and document operating procedures -- This activity includes preparing detailed descriptions of all procedures needed to operate and support the software, including tasks or actions that typical users and system administrators conduct. When procedures are based on those for the existing systems, the documentation should include the modifications.
o Create reference materials -- This activity includes documenting software procedures and developing reference and users' manuals. It produces the materials both for the initial education of users, operators, and other support individuals about the software and for ongoing training and user support during production.
o Prepare test environment -- This activity prepares resources for testing, including the test environment, testing tools, test personnel roles and responsibilities, and finalized test plans. It provides the starting point for the test stage.
o Create database structures and load test data -- This activity includes creating the database(s) and the automated procedures to manage them. It also includes producing the structural changes or additions required to existing databases in accordance with the software design, constructing any new database(s) needed, and loading test data.
New requirements that surface during the build stage may result in immediate modifications to the design. The agency and the contractor can agree to incorporate these requirements into ongoing life-cycle activities, defer them to a later stage and treat them as changes or minor enhancements, or manage them as separate life-cycle efforts. When considering newly identified requirements and/or suggested modifications to the design, the agency should consider their full impact on the software and the organizations that will use it. At the end of the build stage, the software is ready for final testing by the contractor and acceptance testing by the agency. While build and test are distinct stages, in practice many activities in the test stage run parallel with and affect those in the build stage. For example, successful test completion determines whether the software is complete. Thus, at this point in the project schedule, the early testing activities are complete. If users or existing data will be involved in acceptance testing, the agency may also need to conduct preparatory user training and data conversion activities.

13.4 TESTING
The testing stage determines that the software and related procedures developed in the build stage (e.g., conversion, operation, user support) support agency functional and performance requirements and run error free. While this section describes testing as a distinct stage, testing activities actually begin during the build stage. Also, the contractor can test some components while still building others. An effective testing program begins with a design philosophy that anticipates software testing requirements. This philosophy structures the design and development process, simplifies the software, and makes testing more effective. Developers of high quality software incorporate quality control and testing into the early stages of their development process. They design the software with testing in mind. Early detection of errors allows the developer to correct them when they are easier to isolate and avoids building more software that incorporates them.

a. Ensure Thorough Testing
The agency should ensure thorough contractor testing. Due to software complexity and limited resources, no testing program can test all aspects of the developed software. Only key or representative functions are usually tested. Chapter 14 of this Guide provides additional detail on testing and acceptance, including what to test. If serious errors surface during testing, the agency should assume that other significant undetected errors also exist and conduct a complete investigation, beginning with the design. In the worst case, this may result in a redesign. While software development services should always include some level of contractor testing, how much depends on contract requirements and the contractor's standard practices. Contractors have their own approaches to testing, some more rigorous than others. The solicitation and resulting contract can increase emphasis on testing by providing for periodic reporting on the contractor's testing program.

b. Levels of Testing
Test plans should address three levels of tests: unit, integration, and system tests. These tests are separate from the Government acceptance testing or Government-witnessed testing used to determine the acceptability of contractor-developed software. However, the contractor or agency may perform the same tests as part of acceptance testing.
(1) Unit
During unit testing, sometimes referred to as program or module testing, the developer tests individual units to determine if the necessary logic is present and works correctly. Unit testing should identify any missing functions, ensure the code performs as intended without errors, and ensure that the unit is reliable.
(2) Integration
Integration testing begins when the contractor first combines program units. These tests should confirm that the program units link together and interface with the files/databases correctly. In this process, integration testing must consider which units to integrate first, the order for testing them, and how much testing to conduct.
(3) System
System testing begins after integration testing is successfully completed. System testing has three major objectives: (1) demonstrate that the software provides the specified functions; (2) demonstrate that the software meets performance, volume, backup and recovery, and security requirements; and (3) measure the quality of the software in a controlled environment. During this testing, the developer might also perform regression testing (i.e., testing to ensure changes to one part of the software do not adversely impact other areas) to measure the extent to which changes impact functionality.
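The brief sketch below illustrates the unit level only. It uses Python's standard unittest module against a hypothetical overtime-pay routine; the routine and its rules are invented for this example and are not part of any contract requirement discussed in this Guide.

    import unittest

    def overtime_pay(hours_worked: float, hourly_rate: float) -> float:
        """Hypothetical unit under test: pay overtime at 1.5x beyond 40 hours."""
        if hours_worked < 0 or hourly_rate < 0:
            raise ValueError("hours and rate must be non-negative")
        overtime_hours = max(0.0, hours_worked - 40.0)
        return overtime_hours * hourly_rate * 1.5

    class OvertimePayUnitTest(unittest.TestCase):
        def test_no_overtime_below_threshold(self):
            self.assertEqual(overtime_pay(38, 20.0), 0.0)

        def test_overtime_above_threshold(self):
            self.assertAlmostEqual(overtime_pay(45, 20.0), 150.0)

        def test_rejects_invalid_input(self):
            with self.assertRaises(ValueError):
                overtime_pay(-1, 20.0)

    if __name__ == "__main__":
        unittest.main()

Integration and system tests would then exercise this unit in combination with the other program units and the files/databases it feeds.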

13.5 READINESS REVIEW
The readiness review is the last stage of software development before acceptance testing. It provides an opportunity to review all the other components necessary to successfully implement and operate the software (e.g., procedures, training, reference materials). It ensures that all components are ready for acceptance testing to avoid wasting resources due to a partial or aborted acceptance test. The following items are typically included in a readiness review:
o Software documentation accurately describing how the software is intended to perform -- For large software projects, this would be the FDR. For smaller projects, it may include design documents as well.
o Up-to-date software source code listings -- For software that uses compiled code, this includes evidence that the source code has been compiled error-free into the object modules (i.e., machine-readable modules) used by the computer to execute the software.
o Test reports demonstrating successful software performance -- At a minimum, these should address system tests. To ensure thorough testing, the agency could also consider requesting samples of unit and integration test results.
o User, computer operator, and maintenance manuals and documentation -- These should provide current and accurate reference material about the new software.
o Current user, operator, and software maintenance procedures -- Preferably, these should include all the procedures. At a minimum, they must include those necessary for the acceptance test.
o Documentation of successful tests for data conversion programs -- This ensures the contractor is ready to proceed if the software is accepted.
o Documentation of successful conversion and library updates for Operation Control Language (OCL) -- The computer's operating system will use the new OCL to execute the new software's programs.
o Detailed implementation plan -- This should identify the steps required, their dependencies and sequence, the resources required, and contingency plans if the acceptance testing does not proceed as planned.
Usually the contractor provides these items. However, depending on the terms and scope of the contract, the agency may be responsible in part or entirely for some of them. At the end of the readiness review, the agency decides if the software's acceptance testing should proceed.

CHAPTER 14: SOFTWARE TESTING AND ACCEPTANCE
Software testing is an essential part of the development process. It minimizes the impact and cost of errors by finding and correcting them before the system becomes operational. Software testing ensures that the software meets contract requirements and specifications and forms the basis for acceptance. Software testing should be performed throughout the development and implementation life-cycle, beginning during the design phase, continuing into the build and test stages, and continuing through the implementation stages. Software testing activities can include verifying a program, unit, module, or system against both functional and performance specifications. Testing also determines that the software uses hardware resources efficiently, adheres to standards, and can be used effectively by its intended users.

14.1 SOFTWARE QUALITY ASSURANCE (QA)
Software QA is a planned and systematic methodology to ensure that the software meets contract requirements and performs as intended. Software testing is part of the Government's overall QA effort. Agencies and contractors conduct QA activities throughout the software development life-cycle, beginning with requirements validation and continuing throughout development and maintenance. Earlier chapters cover those QA activities that occur in the early stages of the acquisition life-cycle. Software quality assurance typically comprises three sets of factors:
o Functionality -- Does the software do what it is intended to do? Does it perform all the specified functions correctly and reliably?
o Engineering -- How well is the software built? How efficient are the logic and coding? How easily can the system be tested? How comprehensive and accurate is the documentation?
o Adaptability -- How easy is the system to maintain or expand?
As many as three different organizations may have roles in software QA:
o The development contractor
o The agency
o A support contractor

a. Development Contractor
The development contractor has the primary responsibility to develop, demonstrate, and ensure the quality of the software. The contractor's methodology, along with the size and/or complexity of the software development services acquisition, usually determines the contractor's QA activities. Typically they include:
o Reviews -- Reviews help to identify defects as early in the development process as possible. Reviews involve activities associated with walkthroughs (i.e., structured meetings during which a peer group evaluates a product) of requirements, design phase documents, and the program code.
o Testing -- Testing, by manual or automated means, determines if the software satisfies specified functional and performance requirements. Typical methods, described in detail in Section 14.5, include benchmarking, black and white box testing, and testing with real or fabricated data.
o Software configuration management -- Software configuration management ensures that changes to both software products and development processes comply with established standards, practices, and procedures. It includes naming, tracking, and recording each software component, database, and other necessary elements (e.g., documentation, procedures); controlling changes through the agreement of specifically designated members of the project; and maintaining a history. The history documents all changes made during the development and maintenance of the software, when they occurred, and the impacts of the changes.
o Audits -- Audits determine whether the software testing and acceptance processes are proceeding logically and in conformance with requirements. Organizational elements conduct audits independently of the development team. They review software development and maintenance activities. The agency should request documented results of the contractor's audits.

b. Agency
The agency should perform QA processes in addition to those the development contractor performs. However, the agency performs some testing in conjunction with the contractor's internal testing process. The agency can accomplish software design reviews by design walkthroughs, stand-alone analyses of design documentation, or a combination of both. The design's size and complexity should determine the method(s) employed. Agency software quality assurance can include testing, reviews, and audits similar to those performed by the development contractor, but with agency-developed test data. The agency should coordinate testing with the contractor. Typical areas of coordination include scheduling, testing approach, tools used, generation and use of test data, monitoring and collection of results, and problem resolution. The methodology the agency employs depends upon the testing phase and approach. The agency should coordinate audits with the contractor to ensure the schedule is appropriate, audit materials (e.g., listings, test results, documentation) are provided as necessary, and agency and contractor personnel are available as required. Agency personnel or a support contractor should perform the audit. The development contractor should review and address the audit findings.

c. Support Contractor
A third organization, a support contractor, may assist the Government in assuring the quality of the software development. The support contractor augments the agency's staff by providing additional resources and expertise. This contractor's evaluation and testing responsibilities and methodologies consist of performing QA functions as an adjunct to those performed by the agency and the development contractor. The agency tasks the support contractor to perform independent verification and validation (IV&V) of the software programs and system. Verification consists of reviewing, testing, and auditing the software during its development. Validation consists of evaluating the software at the end of the development process to ensure compliance with requirements. The IV&V contractor reports results to the agency, which retains full responsibility and authority for directing the development contractor's activities and approving the software development products (i.e., design, test results, documentation, etc.).

14.2 DETERMINING TEST OBJECTIVES
A major principle of software development testing is that test objectives must be identified. They establish an effective process for testing the quality of the software development and improving both its effectiveness and efficiency. Test objectives should:
o Provide an infrastructure -- Establish a test team; assign roles and responsibilities to organizations/individuals for specific testing activities; identify the testing environment and necessary tools; identify the source(s) of test data; establish procedures for evaluating test results; and establish change control procedures. Section 14.6 discusses how to determine the resources for testing.
o Define a workable strategy -- Identify the scope of the tests and the functions to be tested, determine the level of assurance required, establish a testing approach, and set a schedule.
o Develop the test methodology -- Define the aspects of the software that the contractor will test, the level of detail, test conditions, and the testing process. Section 14.3 discusses how to establish the test methodology.

14.3 DECIDING WHAT TO TEST
The initial step in deciding what to test is to designate a team with overall responsibility for the testing process. It can include both agency and contractor personnel. The test team's initial goal is to design the test strategy to ensure that the software meets agency requirements as specified in the contract. The team must--
o Decide what to test
o Decide how to test
o Determine available resources
o Decide when to test
o Prepare test packages
The test team must define the boundaries of the tests. It is usually neither practical nor possible to test all features of the software because the time and effort required are excessive for the benefits derived. Functional testing, therefore, should focus on the features most important to the agency. However, the team should test most, if not all, performance requirements (e.g., transaction speed, file update speed). The logical starting point is to test the most critical requirements. The test team should then identify other requirements that can reasonably be tested, given the resources available. System characteristics commonly tested include:
o Accuracy -- Assurance that the data entered, processed, and output by the system are correct and complete
o Integrity -- Assurance that the data are processed, stored, and retrieved accurately
o Auditability -- Assurance that the system can substantiate that processing has occurred
o Reliability -- Assurance that the system will perform its functions with the required precision over an extended period of time and have the ability to continue processing in the event problems occur
o Performance -- Assurance that the correct results are available within the acceptable time frame and that the amount of hardware resources and code required by the software to perform its functions is acceptable
o Security -- Assurance that the software and data will be protected against accidental or intentional modification, destruction, misuse, or disclosure
o Compliance -- Assurance that the software is designed in accordance with the agreed upon methodology, policies, procedures, and standards
o Maintainability -- Assurance that the effort required to locate and fix an error is acceptable
o Integration -- Assurance that the software will interface with other systems

14.4 TYPES OF TESTING
Software development testing should cover the entire development cycle. The types of testing include design testing and software testing.

a. Design Testing
Conduct design testing (sometimes called design reviews) during the requirements definition, general design, detailed design, and build phases of the software development life cycle.
o Requirements definition testing -- Ensures that the development addresses all requirements and determines if any can be simplified or eliminated. For example, the agency may simplify or eliminate requirements by modifying the manual procedures for interfacing with the software.
o General design testing -- Determines whether the design provides the most effective and/or efficient solution, given the software's requirements.
o Detailed design testing -- Determines whether the software design meets requirements. During this process, the test team can map the requirements to the detailed design to ensure the developer has addressed all of them (see the traceability sketch following this list).
o System development design testing -- Verifies whether the software meets all the requirements during the development phases.
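One simple way to support detailed design testing is to keep a traceability matrix that maps each requirement to the design element(s) claiming to satisfy it, and to flag any requirement left uncovered. The sketch below is illustrative only; the requirement and component identifiers are hypothetical.

    # Hypothetical requirements-to-design traceability check.
    requirements = {"R-001", "R-002", "R-003", "R-004"}

    # Which requirements each design component claims to satisfy.
    design_components = {
        "report-module": {"R-001"},
        "update-module": {"R-002", "R-003"},
    }

    covered = set().union(*design_components.values())
    unaddressed = requirements - covered

    if unaddressed:
        print("Requirements with no design element:", sorted(unaddressed))
    else:
        print("All requirements are traced to the design.")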

b. Software Testing
Perform software testing as part of the build and test stages. Software testing consists of:
o Unit testing -- This tests the individual units (e.g., programs, modules) to determine if the necessary logic is present and works correctly. Unit testing should identify any missing functions, ensure the code performs as intended without errors, and ensure the unit is reliable.
o Integration testing -- This testing begins when the contractor first combines program units. During this process, the agency should confirm that the program units link and interface with the files/databases correctly. Integration testing must consider which units to integrate first, how much testing to conduct, and in what order.
o System testing -- This testing begins after integration testing is successfully completed. System testing has three objectives: (1) to demonstrate the availability of specified functions, (2) to demonstrate compliance with performance requirements, and (3) to measure the quality of the software in a controlled environment.
o Acceptance testing -- Initiated after successful system testing, acceptance testing confirms that the software is ready for operational use. It is usually performed after the software has been installed in the production environment and should focus on user requirements. The agency, therefore, should involve the users as much as possible. These tests should also address installation, security, recovery, and operational requirements.

14.5 CHOOSING AMONG TEST METHODS
The contractor and agency can employ many test methods, including benchmarking, black and/or white box testing, and testing with real or fabricated data. Most testing strategies use a combination of these methods to achieve the desired results.

a. Benchmark Testing
Benchmark tests use a sample set of data or transactions to check software performance against predetermined parameters and ensure that it meets requirements. Benchmarking is particularly appropriate for large, complex software systems. Benchmark programs present data to the software and check the accuracy of basic calculations, the timing and accuracy of data storage and retrieval functions, and file and memory limitations. Custom benchmarks may be developed to support testing. Their development, however, requires significant resources. As a result, an agency should consider this approach carefully and ensure the projected benefits are worth the development effort.
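As an illustration of the idea, the sketch below times a hypothetical transaction-processing routine over a fabricated sample workload and compares the elapsed time with a predetermined parameter. It is a minimal sketch, not a production benchmark harness; the routine, workload, and threshold are invented for this example.

    import time

    RESPONSE_TIME_LIMIT_SECONDS = 2.0   # predetermined performance parameter

    def process_transaction(record):
        """Hypothetical stand-in for the software function being benchmarked."""
        return sum(record) % 97

    def run_benchmark(sample_transactions):
        start = time.perf_counter()
        for record in sample_transactions:
            process_transaction(record)
        return time.perf_counter() - start

    if __name__ == "__main__":
        sample = [list(range(100))] * 10_000     # fabricated sample workload
        elapsed = run_benchmark(sample)
        print(f"Processed {len(sample)} transactions in {elapsed:.3f} s")
        print("PASS" if elapsed <= RESPONSE_TIME_LIMIT_SECONDS else "FAIL")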


b. Black Box Testing
Black box testing, also called functional testing, verifies that the software functions perform correctly. It examines the software from the user's point of view and determines if the data are processed according to the specifications. Because this type of testing does not consider internal program operation or structure, it can result in the implementation of inefficient code if not combined with white box testing.

c. White Box Testing
White box testing examines the internal structure of the software. This type of test examines the software's structure and logic and develops test data to determine if the logic paths correctly perform the functions required. White box testing requires skilled individuals to examine the logic and determine the circumstances that will require the software to execute the target logic paths. Most white box testing is for individual components (e.g., program, module), not the system as a whole.
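The distinction between the two methods can be seen in a small sketch. Assuming a hypothetical discount routine with two logic paths, a black box test checks outputs against the specification only, while a white box test is designed from the code itself so that every branch executes at least once. The routine and its values are invented for this illustration.

    def discount(order_total: float, is_government: bool) -> float:
        """Hypothetical routine with two logic paths to be exercised."""
        if is_government:
            return order_total * 0.90    # path 1: 10 percent discount
        return order_total               # path 2: no discount

    # Black box view: check outputs against the specification only.
    assert abs(discount(100.0, True) - 90.0) < 1e-9
    assert discount(100.0, False) == 100.0

    # White box view: the tester reads the code, notes the two branches,
    # and fabricates one input per branch so every logic path executes.
    paths_exercised = {("government", discount(50.0, True)),
                       ("non-government", discount(50.0, False))}
    assert len(paths_exercised) == 2
    print("Both logic paths executed:", paths_exercised)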

d. Testing with Fabricated Data
The testing organization generates fabricated data specifically to test software. The organization can use these data either to test the software from a user's perspective (i.e., black box testing) or to examine the software's internal logic (i.e., white box testing). Because the organization can develop specific sets of data to test specific logic, fabricated data allow testing of the functional/logical operations of the software. However, fabricated data can require significant resources to develop. One method to control resources is to develop a baseline of data and augment it to examine additional logical or functional attributes of the software. Another is to consider using test data generators.
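A minimal sketch of the baseline-and-augment approach appears below. It fabricates a repeatable set of records for a hypothetical payroll interface and then augments the baseline with deliberate error cases to exercise edit and validation logic; the field names and file name are invented for this example.

    import csv
    import random

    random.seed(42)   # repeatable fabricated data

    def fabricate_records(count, include_error_cases=False):
        """Build a baseline of fabricated test records; augment as needed."""
        records = []
        for i in range(count):
            records.append({
                "employee number": f"E{i:05d}",
                "hours worked": round(random.uniform(0, 60), 1),
                "hourly rate": round(random.uniform(10, 50), 2),
            })
        if include_error_cases:
            # Augment the baseline to exercise edit and validation logic.
            records.append({"employee number": "", "hours worked": -5, "hourly rate": 0})
        return records

    with open("fabricated_test_data.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["employee number", "hours worked", "hourly rate"])
        writer.writeheader()
        writer.writerows(fabricate_records(100, include_error_cases=True))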

e. Testing with Real Data
Real data test the software without regard to its functions or logic paths or to the data's type, structure, or content. As a result, real data may not test all the functions or logic paths desired and may produce inconclusive results. Preparing and using real data for testing does not require knowledge of the internal logic of the software, usually requires fewer resources, and is more appropriate for performance and stress testing than fabricated data. However, identifying problems in the software may require more resources than using fabricated data because individuals must examine the real data for validity and accuracy, determine the appropriate results, and examine the results of processing the data to validate that they were processed correctly. The testing organization obtains real data from the same sources and in the same format as the data that the new software will process when it becomes operational. The organization can obtain real data from one or more of the following:
o Existing software that will interface with the new software
o Conversion of data from existing software that the new software will replace
o Data in manual form for input to the new software

14.6 DETERMINING RESOURCES FOR TESTING
Agencies should determine in advance the resources available for software testing (i.e., personnel, time, and equipment). Successful testing requires adequate resources consistent with the range of features that the agency will test and the testing techniques planned. The agency needs sufficient personnel from program and IRM/technical organizations available during the time allotted to conduct the tests and evaluate the results. The team must include personnel with skills appropriate for conducting the tests and with seniority appropriate to maintain momentum and proper focus on agency goals. The team should know the agency's funding commitments for testing equipment, personnel, and other test services (e.g., software performance monitoring systems and analyzers) to ensure sufficient availability and approval before testing starts.


14.7 ACCEPTING THE SOFTWARE AFTER SUCCESSFUL TESTING
Acceptance is the Government's acknowledgement that the software fully meets contract requirements. Acceptance is the Contracting Officer's (CO's) responsibility and occurs at or after installation, depending on contract terms. These terms should specify how and when the Government will notify the contractor of acceptance. They may require the testing team to recommend acceptance to the CO based on test results. The CO must determine that the contractor has fulfilled the necessary contractual obligations before accepting the software. The CO should not accept the software if the development contractor has not met the contract requirements. If the contractor proposes to correct problems at a later date, the agency should defer acceptance accordingly. The CO and the testing team should be aware of conditions that may imply acceptance prematurely and inadvertently provide a basis for the contractor to claim the software was accepted. For example, using the software in production before the CO officially accepts it might imply acceptance.

CHAPTER 15: SOFTWARE IMPLEMENTATION
Software implementation includes all activities required to make the software operational. These can include conversion, acceptance tests, pilots, rollout (i.e., distribution) strategies, and training. Agencies usually assign responsibility for overseeing implementation to Information Resources Management (IRM)/technical personnel, who have the requisite software expertise. The users support these individuals to ensure the software meets both functional and performance requirements. The agency should have implementation plans in place for use throughout the software development life cycle. A major part of implementation is the testing required before acceptance. During acceptance testing, the Government fully examines the newly developed software to determine that it conforms to contract requirements. After this determination, the Government accepts the software (i.e., assumes ownership). Chapter 14 provides information on testing.

15.1 IMPLEMENTATION
During implementation, the agency installs the software into the production environment and prepares users to use it.

a. Implementation Steps
Implementation involves simultaneous performance of different but interrelated activities. Formal planning, with conscientious follow-through, ensures that implementation progresses smoothly. Implementation activities occur in three phases:
o Pre-installation activities -- These take place between software design and acceptance for installation in the production environment.
o Installation activities -- These begin after the software passes testing and is ready for acceptance and installation in the production environment. The agency should ideally include a phased installation in its plan.
o Post-installation activities -- These begin after the software is accepted and installed in the production environment. During this step, the agency starts using the software to support its mission and functions.
The sequence, timing, and complexity of activities in each phase depend on many factors, such as the complexity of the implementation strategy, mission criticality, and the complexity of the software. For example, implementation of a centralized software application for one user group is usually easier than implementation of software applications in a distributed environment consisting of many locations across the country.
(1) Pre-installation
Pre-installation activities focus on planning the implementation to minimize its effect on current processing or user functions. Software installation planning should be completed prior to beginning installation activities. The agency should make detailed plans for the following:
o Cutover approach -- Determine how the agency will migrate from the old method of performing the functions to use of the new software.
o Site Preparation -- Arrange for testing and installing any new hardware before the software installation. Arrange for reconfiguration of the workplace, if necessary.
o Training -- Train operations and system support personnel [e.g., software maintenance staff, database administrators, and input/output (I/O) control].
o Disaster Recovery -- Develop a plan for backing out of the new software in case the installation fails (e.g., returning to the old process).
o Acceptance Testing -- Develop a plan for conducting acceptance testing, including methodology, tools, procedures, and personnel.
o Reducing Risks -- Identify implementation risks, the steps taken to mitigate their possible occurrence, and contingency plans.
Some implementation strategies install the software in stages. In this situation, one-time pre-installation activities will take place before any software arrives, then a smaller set of activities will occur before each subsequent delivery. For example, even for software installed at multiple sites, determining the best cutover approach needs to take place only once. However, for each installation, the agency may need to plan and schedule testing and data conversion. Before the agency starts installation, it should ensure pre-installation activities have been successfully completed.
(2) Installation
Installation covers the software's transition from a test environment to a production environment. During this stage, the agency conducts an operational readiness review. This ensures that users have received sufficient training, correct procedures and manuals are in place, data have been converted as applicable, and operations personnel have been trained. When the review is successfully completed, installation of the software into the production environment can begin. The agency should schedule installation during a slow, non-critical work period to minimize disruption. This is usually at night or over a long weekend. Installation activities may place high demands on system resources (personnel, hardware). The schedule should allow personnel to devote the necessary time to installing the new software without impacting current production work. Installation usually involves transferring the software from its testing environment, which may be at the developer's site or at an agency location, into either a pre-acceptance or production library. The agency then tests the software's fundamental capabilities to ensure the installation was successful (i.e., all basic functions exist, required files are accessible, the software/hardware interface is compatible, users can make appropriate access). The agency then conducts acceptance tests according to the plan developed during pre-installation activities. A carefully developed plan can simplify and guide installation testing. The acceptance test team conducts the tests, then documents and relays problems to the contractor for correction. To ensure that problems are corrected, the team repeats the tests. This continues, possibly through several cycles, until the software reaches an acceptable level of performance and reliability.
(3) Post-installation
When the installation is complete, the agency may want to collect user feedback about the software. This feedback can help to improve the effectiveness of the software by identifying additional requirements, training, or documentation needs.
Figure 15-1. Sample Activities of Software Implementation

b. Implementation Considerations
The agency needs to consider and plan for numerous activities associated with the implementation process. Figure 15-1 provides a synopsis of common activities. Each activity is discussed below, grouped by topic area.
(1) Cutover Approach
Selecting the implementation cutover approach depends on the nature of the new software and the tradeoffs involved in the installation alternatives (i.e., cost, time, resources, risks). The approach selected influences the entire implementation process and provides a framework around which to plan the other implementation activities. Figure 15-2 summarizes the advantages and disadvantages of each approach. The alternatives are:
o Abrupt Cutover -- This approach involves simultaneous dismantling of the old software and start-up of the new. At a predetermined time, use of the old software stops and the new software performs the requisite functions. This approach costs the least; however, it has the highest risk. If the new software fails, fallback to the old software may be difficult, if not impossible. In addition, this approach requires a high level of coordination since software, users, data, and support all must be ready at the same time.
o Version Installation -- Version installation involves dividing the installation into a series of increments. The contractor implements a basic set of capabilities in the first increment and adds additional capabilities in subsequent increments. Each increment goes through a complete implementation cycle. This approach is normally reserved for large software development projects with discrete, easily identifiable increments. For instance, if one group interfaces with a particular subsystem, implement that subsystem separately.
o Parallel Operation with Single Cutover Point -- In this approach, both old and new software operate concurrently for a period of time. During this period, all input transactions update the files that support both systems. If problems arise in the start-up of the new software, the agency simply switches back to the old. This approach also provides an opportunity to test and correct the new software. However, operating both systems concurrently can be impractical and costly. In addition, it limits improvements or changes to the software. If the agency's activities are dynamic or subject to frequent legislative changes, this can present problems.
o Parallel Operation with Gradual Shift from the Old Software to the New -- In this approach, both software systems operate concurrently, with the old software discontinued gradually. Like version installation, this approach is good if the agency can easily segment the users. This approach minimizes operational risks. The agency can discontinue the old software as quickly or as slowly as it becomes comfortable with the new software. However, the cost of operating both systems concurrently is high, and there is a chance of confusion among users about which system to use. If this cutover approach is not carefully planned, an agency could find itself straddling two systems.
Figure 15-2. Advantages and Disadvantages of Cutover Approaches
(2) Site Preparation
Installing new software may require site preparation, especially if the agency conducts a concurrent activity, such as acquiring a new hardware platform. Because of time lags, most site preparation activities should start as soon as possible during pre-installation. The agency must complete most site preparation activities before installation can begin. Site preparation activities include:
o Preparing the hardware environment
o Ordering new supplies
o Reconfiguring the workplace for the new software (e.g., installing new or upgrading existing system software to permit access to the new software, installing new workstations)
(3) Data Conversion
Data conversion is a key concern when implementing new software. It is a time-consuming and resource-intensive part of installation. The agency or contractor must convert relevant data into the form required by the new software and may also need to clean up old data. Data conversion plans are usually based on the requirements identified in the analysis of alternatives and should be completed as early as possible during the software development life-cycle. This permits converting data during the development phase. Conversion strategies vary with the complexity of the new software and the cutover method used. Typically, conversion problems center on supporting data for both systems, or parts of both systems, concurrently. The conversion process includes the following (a brief conversion sketch follows item (4) below):
o Determine conversion needs for--
oo Existing computer files
oo Existing manual files
o Assign conversion roles and responsibilities
o Schedule the conversion
o Develop and test conversion programs
o Run conversion programs
o Prepare information currently in paper form for data entry into the new software
o Enter manual data
o Check new files for format and accuracy
(4) Interface Programs
The agency should examine existing computer programs that interface with the new software to determine if they require modification. Modifications may include changes to current data definitions, processing additional data, and developing software to interface with additional functionality in the new software. All identified modifications to the current computer programs should be scheduled and implemented as part of the conversion process.
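The sketch below illustrates, in miniature, what a conversion program of the kind described in item (3) typically does: map records from the old layout to the new one, apply edit and validation checks, and set aside rejected records for review. The field names and rules are hypothetical and are not drawn from any particular system.

    # Hypothetical conversion sketch: map records from the old software's
    # layout to the new one, validate them, and set aside errors for review.

    def convert_record(old):
        """Map one old-format record to the new format (field names invented)."""
        return {
            "employee_number": old["EMPNO"].strip(),
            "name": old["NAME"].title(),
            "hourly_rate": float(old["RATE"]),
        }

    def validate(new):
        errors = []
        if not new["employee_number"]:
            errors.append("missing employee number")
        if new["hourly_rate"] <= 0:
            errors.append("non-positive hourly rate")
        return errors

    def run_conversion(old_records):
        converted, rejected = [], []
        for old in old_records:
            try:
                new = convert_record(old)
            except (KeyError, ValueError) as exc:
                rejected.append((old, [str(exc)]))
                continue
            problems = validate(new)
            (converted if not problems else rejected).append(
                new if not problems else (old, problems))
        return converted, rejected

    legacy = [{"EMPNO": "00123 ", "NAME": "DOE, JANE", "RATE": "27.50"},
              {"EMPNO": "", "NAME": "SMITH, JOHN", "RATE": "0"}]
    good, bad = run_conversion(legacy)
    print(f"{len(good)} records converted, {len(bad)} rejected for review")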
(5) Procedural Changes

Implementation of new software often requires changes in organizational procedures, particularly for large and complex applications. The agency should determine changes in procedures and policies before implementing the new software and revise organizational procedure manuals accordingly. For example, implementation of new financial management software may require the agency to institute new reporting schedules. The agency should obtain any required approvals for the changes early enough so that the installation is not delayed. The agency should also identify changes in job assignments and notify employees as early as possible.

(6) Documentation

Agencies should check documentation provided with the software to determine if it--

o Describes the policies and procedures
o Is written at an appropriate level for intended readers
o Discusses commands and features that relate to application tasks
o Contains indices and cross-references for easy access to information

(7) Preparing the Users

Introducing new software into the agency requires careful and diplomatic management. Some of the most persistent problems result from users' reactions to a new software application and the procedural changes it requires. Involve users throughout the implementation process. They can provide valuable input on day-to-day procedures that will help the agency select the most effective implementation approach. If users see that personnel implementing the software listen to them, they will be less likely to feel that the new software has been arbitrarily imposed on them. In turn, they will more readily accept and support it. The agency should maintain a formal mechanism for user comments for an indefinite period, or at least until users are comfortable and content with the software.

(8) User Training

User training is particularly important when implementing new software. Users will work directly with the application and need to understand its functions, interfaces, and processes. During implementation, the agency and the contractor train most of the users. Training may include hands-on courses, automated instruction, and reading the documentation. The agency should schedule training to minimize disruption of regular work. If possible, some training should begin during pre-installation, especially for personnel required to assist in the installation or who need to use the software immediately. The agency might consider involving users in the acceptance test process so they can obtain early training and experience. After installation, users can begin hands-on training using test data. The agency should evaluate the training when it is first implemented to ensure it effectively and efficiently meets user requirements, and should continue this evaluation throughout the training period to help determine and maintain the training's effectiveness.

(9) Back-up Procedures

The agency and/or contractor should develop procedures for contingencies such as computer failure, data loss, and unanticipated software problems. At a minimum, this includes copying computer records onto tape or disk so that the agency can reconstruct current information, if necessary. The agency should document the frequency of data back-ups, the storage location of back-up records, and the personnel responsible for performing back-ups. If the agency requires continuous operations, it should develop, or ensure that the contractor develops, standby manual procedures in case of system failure.

(10) Security

The agency and/or contractor needs to implement security procedures and standards to prevent unauthorized or inadvertent access to or destruction of information. These should cover the following areas:

o Physical security -- The security of hardware, software, and supplies
o Software access -- The person or group allowed to obtain and use a software program
o File access schemes -- The person or group allowed to read, copy, and edit a file
o Communications -- The security of information during transmission to and from remote locations via a communications network
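As an illustration of the back-up procedures described in (9) above (and not part of the Guide's original text), the sketch below shows how a routine back-up could be scripted and logged. The directory names and the fourteen-copy retention period are assumptions made for the example.

    # backup_sketch.py -- illustrative only; paths and retention period are assumptions
    import shutil
    import time
    from pathlib import Path

    DATA_DIR = Path("/data/new_application")       # hypothetical location of the application's files
    BACKUP_ROOT = Path("/backup/new_application")  # hypothetical storage location for back-up copies
    RETAIN = 14                                    # keep only the most recent 14 back-ups

    def run_backup():
        """Copy the application's data files into a dated back-up directory and prune old copies."""
        stamp = time.strftime("%Y%m%d-%H%M%S")
        target = BACKUP_ROOT / stamp
        shutil.copytree(DATA_DIR, target)
        # Record what was copied and when, so frequency and responsibility can be audited.
        with open(BACKUP_ROOT / "backup.log", "a") as log:
            log.write(f"{stamp}: copied {DATA_DIR} to {target}\n")
        # Prune back-ups beyond the documented retention period.
        copies = sorted(p for p in BACKUP_ROOT.iterdir() if p.is_dir())
        for old in copies[:-RETAIN]:
            shutil.rmtree(old)

    if __name__ == "__main__":
        run_backup()

The dated directories and the log entry correspond to the documentation the Guide calls for: the frequency of back-ups, where the copies are stored, and evidence that the responsible personnel performed them.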

CHAPTER 16: SOFTWARE OPERATION AND MAINTENANCE

Upon implementing new software, the agency must ensure proper operation and maintenance. The agency must also manage enhancements. Software operation refers to routine activities that make the software perform its intended functions. Software maintenance refers to activities that modify the software to keep it performing satisfactorily. Software enhancements refer to significant functional or performance improvements to the software. The boundary between maintenance and enhancement is generally one of degree. Enhancement efforts often represent larger versions of "perfective maintenance" activities as defined later in this chapter. However, modifications that deviate significantly from the application's current design are widely recognized as enhancements. This chapter describes these post-implementation activities.

16.1 SOFTWARE OPERATION AND SUPPORT

Software operation and support involve activities that allow the software and its users to perform intended functions acceptably. These include:

o On-going support services, such as user support, documentation, and training
o Operational support, such as data entry, disaster recovery/backup, performance tuning, and configuration management

Although any of these can be done under contract, agencies often provide their own operational support. Since custom-developed software has many unique aspects (e.g., operational concept, performance parameters), performance monitoring and continued configuration management have particular significance. They provide important status information for maintenance and enhancement activities (e.g., software performance, planned changes). The agency must develop both the standards of comparison for acceptable performance and the measures to use in comparing the software to them. For commercial off-the-shelf (COTS) software, the vendor has already developed these, and only the unique aspects of the agency's installation (e.g., disk space availability) require consideration.


a. Performance Monitoring

For software that uses shared resources (e.g., a local area network, a mainframe processor, or a file server), the agency must measure whether the new software is using more than the expected amount of the resource. This requires setting performance standards and then taking measurements to compare against them. Agencies should consider the following when establishing performance standards and measurements:

o Number of input/output (I/O) reads and writes
o Memory paging activity
o Central processing unit (CPU) cycles used
o Response time for the user
o Transaction processing elapsed time
o Communications characters transmitted and received
o Communications channel utilization

When the software deviates from the established standards, the agency knows a problem exists. However, further analysis may be required to determine whether the performance problem can be resolved operationally (e.g., resolving disk contention by spreading the software's files among more disks) or whether the software needs modification.
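The comparison of measurements against established standards can be automated in a simple way. The following sketch (illustrative only, not part of the Guide) assumes hypothetical standards and measurement names; in practice the observed values would come from the operating system's accounting or monitoring facilities.

    # performance_check_sketch.py -- illustrative only; thresholds and measure names are assumptions
    # Established standards for acceptable performance (hypothetical values).
    STANDARDS = {
        "io_operations_per_minute": 5000,      # I/O reads and writes
        "cpu_seconds_per_transaction": 0.50,   # CPU usage, expressed in seconds per transaction
        "user_response_seconds": 3.0,          # response time for the user
        "channel_utilization_pct": 70.0,       # communications channel utilization
    }

    def check_performance(measurements):
        """Return the measures that deviate from the established standards."""
        deviations = {}
        for measure, limit in STANDARDS.items():
            observed = measurements.get(measure)
            if observed is not None and observed > limit:
                deviations[measure] = (observed, limit)
        return deviations

    if __name__ == "__main__":
        # A sample set of observations for one monitoring period.
        sample = {
            "io_operations_per_minute": 6200,
            "cpu_seconds_per_transaction": 0.42,
            "user_response_seconds": 4.1,
            "channel_utilization_pct": 55.0,
        }
        for measure, (observed, limit) in check_performance(sample).items():
            print(f"{measure}: observed {observed}, standard {limit} -- investigate")

A deviation flagged this way indicates only that a problem exists; the further analysis described above is still needed to decide whether the remedy is operational or a software modification.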

b. Continued Configuration Management

A software configuration is an arrangement of software parts, including all elements necessary for the new software to work. This includes the new software's modules, Operation Control Language (OCL), code libraries, databases, and other software the application relies on (such as the operating system, language compilers, and database management systems), and the relationships among them. Configuration management refers to the process of identifying and documenting the configuration and then systematically controlling changes to it to maintain its integrity (i.e., ensuring the pieces will continue to work together correctly) and to trace configuration changes to agency requirements. It includes controlling the impact the changed software has on other software and managing implementation of the changes. This process does not need to wait for implementation. The agency can and should begin tracking the components as the software development proceeds. Configuration management provides an accurate view of the current software status, as well as anticipated changes.

A Configuration Control Board (CCB) identifies and controls the software configuration. The CCB is a panel of senior managers and technical staff that approves and controls changes to the software and when they are made. If a CCB already exists to support other FIP resources, the agency can add the new software to the Board's responsibilities. If a CCB does not exist, the agency should establish one. A separate CCB may also be necessary for large efforts during which many changes are expected. The CCB should ensure a well-documented baseline software configuration and then control changes, while ensuring the change control process does not inhibit responsiveness to the software users' needs. The software developer may also have an internal configuration control process. The agency should work with the developer to coordinate these processes to make them as efficient as possible without reducing their effectiveness. Including the contractor on the CCB would help ensure coordination; however, the agency should retain direct control of the CCB since it serves a vital control function.

Configuration management provides a valuable baseline for controlling maintenance and enhancement activity. Before an agency can accurately assess the impact of a proposed change, whether software maintenance or enhancement, it needs to know the status of its current configuration and how the changed component will affect other components. Configuration management typically has four major functions:

o Configuration identification -- Determines what components to include in the configuration and develops unique identifiers for tracking individual components and adding new ones.

o Configuration control -- Imposes discipline on the change process to ensure that items changed or added to the configuration complete all the necessary testing and approval steps before inclusion. This includes assessments of probable impacts on other components of the configuration, as well as on the automated data processing (ADP) environment.

o Configuration requirements tracing -- Ensures that configuration changes are traceable back to user requirements either directly (e.g., a user-requested change) or indirectly (e.g., better user support through improved system performance).

o Configuration status accounting -- Reports the current status of components in the configuration as well as components undergoing change or about to be added.
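To make the four functions concrete, the sketch below (illustrative only, not part of the Guide) shows a minimal configuration register that a CCB's support staff might keep. The item names, statuses, and change details are invented for the example.

    # configuration_sketch.py -- illustrative only; item names and statuses are assumptions
    from dataclasses import dataclass, field

    @dataclass
    class ConfigurationItem:
        """One component of the software configuration, tracked by a unique identifier."""
        item_id: str                    # configuration identification: unique identifier
        description: str
        version: str
        status: str = "baselined"       # e.g., baselined, change-approved, in-test
        traced_requirements: list = field(default_factory=list)  # requirements tracing

    class ConfigurationRegister:
        """Minimal status-accounting register maintained on behalf of the CCB."""
        def __init__(self):
            self.items = {}

        def identify(self, item):
            """Configuration identification: add a component to the controlled configuration."""
            self.items[item.item_id] = item

        def approve_change(self, item_id, new_version, requirement):
            """Configuration control: record a CCB-approved change and its requirement trace."""
            item = self.items[item_id]
            item.version = new_version
            item.status = "change-approved"
            item.traced_requirements.append(requirement)

        def status_report(self):
            """Configuration status accounting: report each component's version and status."""
            return [(i.item_id, i.version, i.status) for i in self.items.values()]

    if __name__ == "__main__":
        register = ConfigurationRegister()
        register.identify(ConfigurationItem("PAY-001", "Payroll calculation module", "1.0"))
        register.approve_change("PAY-001", "1.1", "User request 93-042: new withholding table")
        print(register.status_report())

Even a simple register of this kind supports the impact assessment described above: before a change is approved, the agency can see which components exist, their current versions, and which requirements earlier changes traced back to.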


16.2 SOFTWARE MAINTENANCE

Software maintenance consumes a large portion of the funding committed to software; for many organizations, more than half. The agency should establish how it will perform software maintenance and coordinate this with its configuration management functions. Maintenance activities generally begin with a problem report or requested change that identifies the software improvement needed. These activities normally fall into one of three broad categories:

o Corrective maintenance -- Corrects errors in the software that generate inaccurate or unreliable results. Many people think of this category when referring to maintenance. However, it is less common than the other categories and becomes less likely as software matures and initial errors are discovered and corrected.

o Perfective maintenance -- Refines the software to improve its function based on user- or programmer-initiated change requests. This is the most common category of maintenance. In part, perfective maintenance is needed because the methods of identifying and documenting user requirements are imperfect and so leave some margin between the defined requirements and the users' intent. In addition, user requirements tend to evolve as users learn more about the software and develop a clearer understanding of what it should do.

o Adaptive maintenance -- Adapts the software to changes in its environment, such as an operating system upgrade. Many different components of the ADP environment undergo transitions simultaneously. As a result, even when the software itself is stable, it may require maintenance to correspond with changes to underlying software or hardware.

Software maintenance includes more than the correction of software errors. Most maintenance responds to user requests for functional improvements. These can vary from very minor modifications to extensive revisions.
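Because maintenance begins with a problem report or change request, some agencies find it useful to record each request with its category so that the maintenance workload can be tracked alongside the configuration register. The sketch below is illustrative only; the request identifier, date, and summary are invented, and the categories simply mirror the three defined above.

    # maintenance_log_sketch.py -- illustrative only; request details are assumptions
    from dataclasses import dataclass
    from datetime import date

    CATEGORIES = ("corrective", "perfective", "adaptive")

    @dataclass
    class ChangeRequest:
        """A problem report or requested change that starts a maintenance activity."""
        request_id: str
        submitted: date
        category: str              # one of CATEGORIES
        summary: str
        ccb_approved: bool = False  # coordinate with configuration management before work begins

        def __post_init__(self):
            if self.category not in CATEGORIES:
                raise ValueError(f"unknown maintenance category: {self.category}")

    if __name__ == "__main__":
        request = ChangeRequest("CR-0147", date(1993, 6, 1), "perfective",
                                "Add a summary page to the monthly report")
        print(request)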

16.3 SOFTWARE MODIFICATION/ENHANCEMENT

Software modification/enhancement constitutes another important aspect of post-implementation support, but it differs from maintenance primarily in degree, either in terms of the level of effort the changes will require or the magnitude of the variance from the current software design. Agencies must consider the integrity of the software design when performing enhancements. Successive enhancements may erode the integrity of the design. Users often evaluate enhancements for their immediate impact, but may not examine them in the context of the overall software design (i.e., was the software intended to perform this function?). In forcing the software to perform functions not incorporated in its original design, programmers may resort to expedient compromises and makeshift solutions. As a result, the software becomes increasingly difficult to maintain and modify. For major modifications, agencies should validate that the original software design can support the modifications. If not, they should take the time to modify the basic design to accommodate the required changes. This integrates enhancements into the design on an ongoing basis and prolongs the software's useful life.

APPENDIX A: GLOSSARY AND ACRONYMS

GLOSSARY OF TERMS

Acceptance: The act of an authorized representative by which the Government, for itself or as an agent of another, assumes ownership of supplies tendered or approves specific services rendered as partial or complete performance of the contract.

Acceptance Testing: Testing to determine whether products meet the requirements specified in the contract before the Government accepts them.

Acquisition: The acquiring by contract with appropriated funds of supplies or services (including construction) by and for the use of the Federal Government through purchase or lease, whether the supplies or services are already in existence or must be created, developed, demonstrated, and evaluated. Acquisition begins at the point when agency needs are established and includes the description of requirements to satisfy agency needs, solicitation and selection of sources, award of contracts, contract financing, contract performance, contract administration, and those technical and management functions directly related to the process of fulfilling agency needs by contract.

Acquisition Life-cycle: The period covering all acquisition-related activities. The acquisition life-cycle for software development services begins when agency needs are established and ends with the conclusion of the services.

Acquisition Strategy: The set of decisions that determines how products and services will be acquired, including contracting method, contract duration, contract pricing, and quantities.

Agency Procurement Request (APR): A request by an agency for contracting authority above its regulatory or specific agency delegation.

Analysis of Alternatives: The process of determining how an agency's needs for products and services will be met.

Analytical Studies: Activities to determine an agency's needs and technical and acquisition options.

Application Software: Programs that perform specific tasks, such as word processing or database management.

Archiving: Moving electronic files no longer being used to less accessible and usually less expensive storage media for safe keeping.

Baseline: A version of software used as a starting point for later versions.

Batch Mode: Grouping all files related to a specific job and transmitting them as a unit. Also referred to as deferred time or off-line processing.

Benchmark: A test of the performance and capabilities of newly developed software using actual or simulated workloads.

Best and Final Offer (BAFO): A final opportunity for offerors in the competitive range to revise proposals.

Black Box Testing: A method of verifying that software functions perform correctly without examining the internal logic.

Code: A set of symbols that represents data or instructions used in developing software programs.

Commerce Business Daily (CBD): A daily publication that lists the Government's proposed contract actions, contract awards, subcontracting leads, sales, surplus property, and foreign business opportunities.

Commercial Software: Software available through lease or purchase in the commercial market from a concern representing itself to have ownership of marketing rights in the software. It includes software furnished as part of the ADP system but separately priced.

Competitive Range: The offerors selected who have a reasonable chance of being awarded a contract after proposal evaluation, based on total cost and other factors stated in the solicitation. The competitive range is established for the purposes of conducting oral or written discussions and requesting best and final offers.

Compiler: A computer program that translates large sections of source code into object code the computer can understand.

Component Computer-Aided Software Engineering (C-CASE) Tools: Upper and Lower CASE components from multiple vendors integrated to work together.

Computer-Aided Software Engineering (CASE) Tools: A class of software tools that provide plans, models, and designs. CASE tools enforce consistency across multiple diagrams and store information built up by analysts and designers in a central repository. Also referred to as Computer-Aided Systems Engineering.

Configuration Management: The process of controlling the software and documentation so they remain consistent as they are developed or changed.

Contract Administration: Management of a contract to ensure that the Government receives the products and services specified within established costs and schedules.

Contracting Officer (CO): The person with the authority to enter into, administer, and/or terminate contracts and make related determinations and findings for the Government.

Contracting Officer's Representative (COR): An individual to whom the CO delegates certain contract administration responsibilities.
Contracting Officer's Technical Representative (COTR): An individual to whom the CO delegates certain contract administration responsibilities, usually related to technical acceptance issues.

Contractor: An entity providing products and services to the Government under a contract.

Conversion: Changing data and/or existing software into another format.

Cost Evaluation Panel (CEP): The individuals responsible, during the source selection process, for performing the cost evaluation of proposals submitted in response to a solicitation.

Cost-Reimbursement Contract: A contract which provides for payment of allowable incurred costs to the extent prescribed in the contract.

Custom-developed Software: Programs developed for specific purposes.

Data: Factual information stored on a computer.

Database Management System (DBMS): Computer software used to create, store, retrieve, change, manipulate, sort, format, and print information in a database. Also, software that controls the organization, storage, retrieval, security, and integrity of data in a database.

Data Dictionary: In a database management program, an on-screen listing of all the database files, indices, views, and other files relevant to a database application.

Debugging: Correcting errors in hardware or software.

Delegation of Procurement Authority (DPA): Authority provided by GSA to contract for products and services.

Discussions: Oral or written communications between the Government and an offeror that involve information essential for determining the acceptability of a proposal or provide an offeror an opportunity to revise or modify its proposal.

Environment: The hardware and operating system on which application software operates.

Evaluation Factors: The aspects of a proposal evaluated quantitatively or qualitatively to arrive at an integrated assessment as to which proposal can provide the best solution to meet the Government's requirements as described in the solicitation.

Evaluation Standards: Indicators that serve as measurement guides to determine how well an offeror's response meets the requirements of the solicitation.

Executable Code: Programs in machine language ready to run in a particular computer environment.

Federal Acquisition Regulation (FAR): The regulation that codifies uniform acquisition policies and procedures for use by executive agencies to acquire supplies and services with appropriated funds. All Federal agencies use the FAR in conjunction with the FIRMR when acquiring FIP resources.

Federal Information Processing (FIP) Resources: Automatic data processing equipment (ADPE) as defined in Public Law 99-500 (40 U.S.C. 759(a)(2)) and set out in paragraphs (a) and (b) of this definition. (a) Any equipment or interconnected system or subsystems of equipment that is used in the automatic acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information -- (1) by a Federal agency, or (2) under a contract with a Federal agency which -- (i) requires the use of such equipment, or (ii) requires the performance of a service or the furnishing of a product which is performed or produced making significant use of such equipment. (b) Such term includes -- (1) computers; (2) ancillary equipment; (3) software, firmware, and similar procedures; (4) services, including support services; and (5) related resources as defined by regulations issued by the Administrator for General Services.

Federal Information Resources Management Regulation (FIRMR): The regulation that sets forth uniform policies and procedures for acquiring, managing, and using FIP resources and Federal records. Federal agencies use it in conjunction with the FAR.

Fee: The portion of total remuneration to a contractor over and above allowable costs.

FIP Support Services: Any commercial, nonpersonal services, including FIP maintenance, used to support FIP equipment, software, or services.

Fixed-Price Contract: A contract which provides for a firm price or, in appropriate cases, an adjustable price.

Government Furnished Equipment (GFE): Equipment in the possession of or acquired by the Government that is made available to a contractor.

Government Furnished Information (GFI): Information in the possession of or acquired by the Government that is made available to a contractor.

Graphical User Interface (GUI): A combination of menus, screen design, keyboard commands, command language, and help screens that together create the way a user interacts with a computer. Allows users to move in and out of programs and manipulate their commands by using a pointing device (often a mouse).

Imaging System: A method of translating and recording pictures in microfilm, videotape, or computer format.

Independent Verification and Validation (IV&V): Review, analysis, and testing conducted by an independent entity throughout the life-cycle of software development to ensure that the new software meets contract requirements and agency needs.

Information Resources Management (IRM): Planning, budgeting, organizing, directing, training, promoting, controlling, and management activities associated with the burden, collection, creation, use, and dissemination of information by agencies, including management of information and related resources, such as Federal information processing resources.
Integrated Computer-Aided Software Engineering (I-CASE) Tools: Software tools that provide for planning, analysis, and design with fully-integrated code generation. These tools are fully integrated so one tool component directly employs information from another. A repository stores the knowledge from multiple tools in an integrated manner. Also referred to as Integrated Computer-Aided Systems Engineering.

Integration Test: A process to confirm that program units are linked together and interface with the files/databases correctly.

Interactive Mode: The ability to interact or converse with a computer by giving commands and receiving responses in real time.

Liquidated Damages: Compensation paid to the Government by a contractor when it is late in delivering goods or performing services.

Local Area Network (LAN): A system of hardware and software connected by a common transmission medium for the purpose of sharing common hardware, software, and data resources within a limited geographical area.

Magnetic Storage: Primary storage media (disk or tape) for computers.

Media: Material that stores or transmits data, such as hard or floppy disks, magnetic tape, coaxial cable, and twisted wire pair.

Metrics: Quantitative means of measuring software development.

Negotiated Acquisition: The method of contracting in which offerors submit proposals in response to a solicitation. The proposals are evaluated and terms negotiated prior to award.

Object Code: Instructions in machine-readable language produced by a compiler or assembler from source code.

Offeror: An entity that responds to a Government solicitation.

Operating System: A master control program or set of programs that manages the basic operations of a computer system.

Operation Control Language (OCL): A programming language used to code a statement that identifies or describes requirements to an operating system.

Packaged Software: The uncustomized portion of commercial software.

Performance Measures: Standards by which a contractor's work is measured.

Pilot Testing: Using a limited version of software in restricted conditions to discover if the programs operate as intended.

Portable Operating System Interface for Computer Environments (POSIX): An IEEE, vendor-independent definition of a standard interface between application programs and an underlying operating system.

Program Manager (PM): The key management official who represents the agency's program office in formulating FIP resources requirements and managing presolicitation activities. In some organizations, the program manager or another management official is designated as the acquisition manager for a specific acquisition.

Proposal: Response to a solicitation when contracting by negotiation or participating in the first step of two-step sealed bidding.

Protest: A written objection by an interested party to: (1) a solicitation for a proposed contract, (2) a proposed award, or (3) the award of a contract.

Prototyping: Creating a demonstration model of a new system.

Quality Predictor: A measure for estimating the likelihood of a contractor's delivering high quality services.

Random Access Memory: The computer's primary working memory, in which program instructions and data are stored so that they are accessible directly to the central processing unit.

Rapid Application Development (RAD): A methodology for developing software systems that relies on prototyping techniques and extensive user interaction.

Recovery: The way a computer system resumes operation after overcoming a hardware problem or a software error.

Regression Testing: A method to ensure changes to one part of the software do not adversely impact other areas.

Request for Comment (RFC): An announcement in the Commerce Business Daily or other publication requesting industry comment on draft specifications or solicitations for FIP resources.

Request for Information (RFI): An announcement in the Commerce Business Daily or other publication requesting information from industry about a planned acquisition and, in some cases, corporate capability information.

Request for Proposals (RFP): A type of solicitation used in contracting by negotiation to communicate Government requirements and to solicit proposals.

Requirements Analysis: The process of determining an agency's need for FIP resources.

Requirements Traceability Matrix (RTM): An automated tool that maps functional requirements to physical configuration items, such as computer programs or databases.

Scope: The description in the Statement of Work of all products and services to be provided by the contractor.

Software: System, utility, or application programs expressed in a computer-readable language.

Software Development Services: Services provided to develop new application software. The services may also include front-end analysis and back-end testing and implementation tasks.

Software Development Life-cycle: The sequence of events in the development of software.

Software Enhancement: Significant functional or performance improvements.

Software Maintenance: Activities that modify software to keep it performing satisfactorily.

Software Operation: Routine activities that make the software perform without modification.

Software Release: An updated version of commercial software to correct errors, resolve incompatibilities, or improve performance.
Solicitation: An official Government request for bids or proposals.

Source Selection Plan (SSP): The document that explains how proposals will be solicited and evaluated in order to make the source selection decision.

Specification: A description of the technical requirements for a material, product, or service.

Standard: An established basis of performance used to determine quality and acceptability.

Statement of Work (SOW): A detailed and complete description of requirements prepared for inclusion in a solicitation.

System Software: A series of control programs, including the operating system, communications software, and database manager.

Technical Evaluation Panel (TEP): The individuals responsible, during the source selection process, for performing the technical evaluation of proposals submitted in response to a solicitation.

Technical Leveling: Helping an offeror bring its proposal up to the level of other proposals.

Technical Transfusion: Disclosing technical information from one or more vendors' proposals that allows competing vendors to improve their proposals.

Terminate-and-Stay-Resident (TSR): Programs that remain in memory so that they can be instantly accessed over another application by pressing a key. The program is displayed either as a small window on top of the existing text or image or on the full screen.

Test Plan: A plan prepared by the Government or contractor that details the specific tests and procedures for determining if FIP resources comply with contract requirements.

Throughput: The speed with which a computer processes data.

Time-and-Materials Contract: A contract in which the Government reimburses a contractor for total labor charges (based on time expended at fixed labor rates) and for materials used to complete the work.

Transactions: An activity or request to a computer. Orders, purchases, changes, additions, and deletions are examples of transactions that are recorded in a business information environment.

Uniform Contract Format (UCF): The format required by the FAR for preparation of solicitations and contracts.

User Interface: A combination of menus, screen design, keyboard commands, command language, and help screens that together create the way a user interacts with a computer. Hardware, such as a mouse or touch screen, is also included.

Version: A new publication of commercial software reflecting major changes made in functions.

White Box Testing: A method to examine the internal structure of a program/module to determine if the logic paths correctly perform the functions required.

Wide Area Network (WAN): A computer network capable of transmissions over a large geographical area, generally linking several smaller LANs together for the purpose of transmitting information.

Workload: A collection of logically distinct, identifiable problems that an information system acts upon to support agency functions (e.g., payroll).

LIST OF ACRONYMS

Acronym     Definition

4GLs        Fourth Generation Programming Languages
ACO         Administrative Contracting Officer
ADP         Automatic Data Processing
ADPE        Automatic Data Processing Equipment
ANSI        American National Standards Institute
APR         Agency Procurement Request
ASCII       American Standard Code for Information Interchange
BAFO        Best and Final Offer
CASE        Computer-Aided Software Engineering (also referred to as Computer-Aided Systems Engineering)
CBD         Commerce Business Daily
C-CASE      Component Computer-Aided Software Engineering (also referred to as Component Computer-Aided Systems Engineering)
CCB         Configuration Control Board
CEP         Cost Evaluation Panel
CICA        Competition in Contracting Act
CLIN        Contract Line Item Number
CO          Contracting Officer
COAT        Council of Accessible Technology
COCA        Clearinghouse on Computer Accommodation
COR         Contracting Officer's Representative
COTR        Contracting Officer's Technical Representative
COTS        Commercial Off-the-Shelf Software
CPAF        Cost-Plus-Award-Fee Contract
CPFF        Cost-Plus-Fixed-Fee Contract
CPIF        Cost-Plus-Incentive-Fee Contract
CPU         Central Processing Unit
DBMS        Data Base Management System
DCAA        Defense Contract Audit Agency
DoD         Department of Defense
DPA         Delegation of Procurement Authority
DSO         Designated Senior Official
FAR         Federal Acquisition Regulation
FDR         Functional and Data Requirements
FEDSIM      Federal Systems Integration and Management Center
FFP         Firm-Fixed-Price Contract
FIP         Federal Information Processing
FIPS PUB    Federal Information Processing Standards Publication
FIRMR       Federal Information Resources Management Regulation
FISSP       Federal Information Systems Support Program
FMSS        Financial Management Systems Software
FOIA        Freedom of Information Act
FPE         Fixed-Price with Economic Price Adjustments Contract
FPI         Fixed-Price-Incentive Contract
FPMR        Federal Property Management Regulation
GAO         General Accounting Office
GFE         Government Furnished Equipment
GFI         Government Furnished Information
GOSIP       Government Open Systems Interconnection Profile
GSA         General Services Administration
GSBCA       General Services Board of Contract Appeals
GUI         Graphical User Interface
HIPO        Hierarchical Input-Process-Output
I-CASE      Integrated Computer-Aided Software Engineering (also referred to as Integrated Computer-Aided Systems Engineering)
IEEE        Institute of Electrical and Electronics Engineers
IFB         Invitation for Bids
IGCE        Independent Government Cost Estimate
I/O         Input/Output
IRM         Information Resources Management
IRMS        Information Resources Management Service, GSA
ISO         International Organization for Standardization
IV&V        Independent Verification and Validation
JAD         Joint Application Development
KM          Office of Information Resources Management Policy, IRMS, GSA
KMA         Acquisition Reviews Division, IRMS, GSA
KAL         Agency Liaison Officer Program Division, IRMS, GSA
KMP         Policy Analysis Division, IRMS, GSA
LAN         Local Area Network
MAS         Multiple Awards Schedule
MTBF        Mean-Time-Between-Failures
NIST        National Institute of Standards and Technology
NSEP        National Security and Emergency Preparedness Program
OCL         Operation Control Language
OFPP        Office of Federal Procurement Policy
OMB         Office of Management and Budget
OOP         Object-Oriented Programming
OTA         Office of Technical Assistance
P&CV        Performance and Capability Validation
PCO         Procuring Contracting Officer
PM          Program Manager
POSIX       Portable Operating System Interface for Computer Environments
POTS        Purchase of Telecommunications Services Program
QA          Quality Assurance
RAD         Rapid Application Development
RAM         Random Access Memory
RFC         Request for Comment
RFI         Request for Information
RFP         Request for Proposals
RTM         Requirements Traceability Matrix
SBA         Small Business Administration
SIS         Systems Integration Services
SOW         Statement of Work
SQL         Structured Query Language
SSA         Source Selection Authority
SSAC        Source Selection Advisory Council
SSEB        Source Selection Evaluation Board
SSP         Source Selection Plan
SST         Source Selection Team
TEP         Technical Evaluation Panel
T&M         Time-and-Materials Contract
TSR         Terminate-and-Stay-Resident Program
UCF         Uniform Contract Format
USC         United States Code
WBS         Work Breakdown Structure

APPENDIX B: LEGISLATIVE AND REGULATORY ENVIRONMENT

The regulatory framework surrounding the Federal acquisition process consists of Federal laws; Governmentwide regulations, directives, and guidance; and individual agency regulations, directives, and guidance. This appendix contains summaries of those that most directly impact the acquisition of Federal information processing (FIP) resources and, therefore, the software development process.

B.1 FEDERAL LAWS

a. Brooks Act (P.L. 89-306)

The Brooks Act, enacted in 1965, establishes the basic policy for the management of automatic data processing equipment (ADPE). Public Law 99-500, the Paperwork Reduction Reauthorization Act of 1986, expands and clarifies its scope to include telecommunications resources, software, and computer-related services such as computer service bureaus and contract programming. The Brooks Act grants specific authority and responsibility to the General Services Administration (GSA), the Office of Management and Budget (OMB), and the Department of Commerce (DOC).

b. Warner Amendment (P.L. 97-86, P.L. 99-500)

The Warner Amendment, enacted in 1981, exempts the Department of Defense (DoD) from the Brooks Act for certain applications. The same exemption was subsequently incorporated into the Brooks Act, as amended by Public Law 99-500 in 1986.

c. Paperwork Reduction Act (P.L. 96-511)

The Paperwork Reduction Act was enacted in 1980 to ensure that automated data processing (ADP) and telecommunications technologies are acquired and used in a manner that improves service delivery and program management, increases productivity, reduces waste and fraud, and -- wherever practicable and appropriate -- reduces the information processing burden for the Federal Government and for persons who provide information to the Government. OMB's Office of Information and Regulatory Affairs is assigned overall authority for implementation of the Act and has defined paperwork reduction requirements in OMB Circular A-130.

d. Paperwork Reduction Reauthorization Act of 1986 (P.L. 99-500)

The Paperwork Reduction Reauthorization Act significantly expands the Brooks Act's definition of ADPE to reflect the merging of automatic data processing, telecommunications, and related technologies. The law defines ADPE as any equipment or interconnected system or subsystems of equipment that is used in the automatic acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information. This includes: (1) computers; (2) ancillary equipment; (3) software, firmware, and similar procedures; (4) services, including support services; and (5) related resources as defined by regulations issued by GSA. The Act combines the old ADP and Federal Telecommunications Funds into one Information Technology (IT) Fund and gives the General Services Board of Contract Appeals (GSBCA) permanent jurisdiction over ADPE protest resolution.


e. Competition in Contracting Act (P.L. 98-369)

The Competition in Contracting Act (CICA), enacted in 1984, mandates a policy of full and open competition, with certain limited exceptions. The Act requires:

o Collection of market research data, costs, and pricing information before preparing a solicitation
o Evaluation of proposals only on factors included in the solicitation
o Designation of competition advocates for each agency and procuring organization of the agency

CICA reinforces the Government's policy to place a fair proportion of its acquisitions, including contracts and subcontracts, with small businesses and small disadvantaged businesses.

f. Computer Security Act of 1987 (P.L. 100-235)

The Computer Security Act of 1987 amended several laws to add provisions to protect computer-related assets (e.g., hardware, software, and data). The Act assigns responsibility for the development of computer security guidelines and standards to the National Institute of Standards and Technology (NIST). The Act also requires agencies to identify systems that contain sensitive data and develop a security plan for each of them.

g. Privacy Act of 1974 (P.L. 93-579)

The Privacy Act of 1974 provides for the protection of information about individuals maintained in Federal information systems. It establishes specific criteria for maintaining the confidentiality of sensitive data and guidelines for determining which data are covered. According to the Act, Federal agencies and employees are responsible for: (1) maintaining the confidentiality of data covered by the Act and (2) taking actions necessary to reasonably ensure that the data about individuals maintained in Federal information systems are accurate. OMB Circular A-130 and the Federal Information Resources Management Regulation (FIRMR) implement provisions of the Privacy Act. Another resource for assistance in complying with the Privacy Act is FIPS PUB 41, Computer Security Guidelines for Implementing the Privacy Act of 1974.

h. Service Contract Act (41 USC 351 et seq.)

The Service Contract Act of 1965, as amended, covers minimum wages, fringe benefits, notification to employees of minimum allowable compensation, and safe and sanitary working conditions under most types of service contracts. In addition, the Act places a limitation of five years on certain service contracts.

i. Office of Federal Procurement Policy (OFPP) Act (41 USC 423)

The OFPP Act includes procurement integrity provisions (commonly referred to as the Procurement Integrity Act), which prohibit certain activities by contractors and Government officials during the conduct of Federal procurements. In general, the prohibited activities involve soliciting or discussing post-Government employment, offering or accepting a gratuity, and soliciting or disclosing proprietary or source selection information. In addition, the procurement integrity provisions cover (1) certification and enforcement matters and (2) sanctions, including administrative actions and contractual, civil, and criminal penalties. Much of the conduct that the Act prohibits is also covered by other statutes and regulations.

j. Copyright Laws (17 USC)

The Copyright Act of 1980 amended the copyright laws to recognize the realities of modern data processing systems. Section 117 permits copying of copyrighted software for backup or archival purposes if a copy is required to install the software. Management must ensure that procedures are in place to prevent unauthorized use or duplication of copyrighted programs. These procedures must include appropriate disciplinary action.


k. Trade Secrets Act (18 USC 1905)

The Trade Secrets Act established specific penalties for the improper disclosure of trade secrets entrusted to Government agencies.

l. Patent and Trademark Laws (35 USC)

When a system contains or uses patented hardware or software, users have the responsibility to protect the rights of the patent holder. Specifically, the user must ensure that the patented item is not improperly disclosed, used, or copied.

m. Contract Disputes Act of 1978 (P.L. 95-563)

This Act provides for the resolution of claims and disputes relating to Government contracts awarded by Executive agencies. The Act--

o Establishes requirements for contracting officers to consider contractor and Government claims;
o Authorizes agencies, under certain conditions, to establish a Board of Contract Appeals;
o Establishes rights of contractors to appeal a contracting officer's decision to the agency's Board of Contract Appeals or directly to the courts;
o Requires agencies which have not established a Board of Contract Appeals to use another agency's Board;
o Establishes that a contractor or the Government may appeal decisions of a Board of Contract Appeals to the Court of Appeals for the Federal Circuit;
o Requires the Boards to establish accelerated procedures for optional use in disputes of $50,000 or less; and
o Requires the Boards to establish small claims procedures for optional use in disputes of $10,000 or less.

B.2 GOVERNMENTWIDE REGULATIONS, DIRECTIVES, AND GUIDANCE

a. Federal Acquisition Regulation (FAR)

The FAR is the primary regulation used by all Executive agencies for the acquisition of supplies and services with appropriated funds. It was developed in accordance with the requirements of the OFPP Act of 1974, as amended by Public Law 96-83, to consolidate the procurement regulations of GSA, DoD, and the National Aeronautics and Space Administration (NASA) into a single regulation providing for coordination, simplicity, and uniformity in the acquisition process.

b. Federal Information Resources Management Regulation

The FIRMR governs the acquisition, management, and use of ADP and telecommunications resources. It also applies to the creation, maintenance, and use of Federal records. The FIRMR was developed to ensure that information processing resources are acquired, managed, and used to improve service delivery and program management, increase productivity, reduce waste and fraud, and minimize paperwork burdens. Explanations and guidance are now contained in a series of bulletins. The FIRMR is used in conjunction with the FAR when acquiring FIP resources by contracting. The FIRMR relies on the FAR for general policies and procedures. The policies and procedures in FIRMR Part 201-39 are in addition to, not in lieu of, the FAR, except when the FIRMR specifically requires that agencies follow its policies and procedures. The FIRMR also requires Federal agencies not subject to the FAR to follow the FAR when acquiring FIP resources subject to the FIRMR.

c. Office of Management and Budget Circulars

OMB has issued a number of circulars that directly impact the acquisition of FIP resources. They include:

o A-130, Management of Federal Information Resources -- Consolidates policy for the management of FIP resources in Executive agencies. In addition, it assigns responsibilities to specific agencies and contains information regarding--
  oo Agency responsibilities for maintaining records about individuals;
  oo Cost accounting, cost recovery, and interagency sharing of information technology facilities; and
  oo Security of agency automated information systems.

o A-76, Policies for Acquiring Commercial or Industrial Products and Services Needed by the Government -- Contains policies and procedures for determining whether functions should be performed by outside sources (such as contractors) or by Government personnel. In the commercial sector, contracting out functions is referred to as "outsourcing."

o A-109, Major System Acquisitions -- Establishes policies for agencies to follow when acquiring major systems. The circular directs agencies to--
  oo Express their needs and objectives in mission and functional terms;
  oo Interact with Congress;
  oo Establish clear lines of responsibility, authority, and accountability;
  oo Allow competitive exploration of alternative system design concepts (usually through multiple contract awards for the design phase, with the contract for building the system awarded to the vendor whose design is in the Government's best interest); and
  oo Look to industry to supply needs in accordance with OMB Circular A-76.

d. Agency-Specific Guidance

Some agencies are subject to special regulations or have internal directives concerning acquisition of FIP resources. Contact both the agency's contract and IRM/technical staff for agency-specific guidance.

B.3 OVERSIGHT ORGANIZATIONS

The following five Government organizations have primary authority for overseeing various aspects of the acquisition of FIP resources.

a. General Services Administration

The Brooks Act directs GSA to coordinate and provide for the economic and efficient purchase, lease, and maintenance of ADPE by Federal agencies. GSA has exclusive authority to procure FIP resources, with the power to delegate procurement authority to Federal agencies to the extent GSA determines necessary or desirable. The Brooks Act prohibits GSA from impairing or interfering with an agency's determination of its FIP resources requirements, including developing specifications for and selecting types and configurations of needed resources. GSA also cannot control an agency's use of the resources. GSA issues and maintains changes to the FIRMR and conducts other acquisition activities such as issuing delegations of procurement authority (DPA). GSA also issues and maintains the Federal ADP and Telecommunications Standards Index. This index provides a comprehensive list of Federal standards and terminology to permit agencies to incorporate standards into solicitations easily.

b. Office of Management and Budget

Under the Brooks Act, OMB is charged with fiscal control and the development of administrative and management policy for FIP resources. OMB has assigned the day-to-day management functions to GSA. The Paperwork Reduction Act (and its reauthorization) granted OMB broad authority concerning the planning, budgeting, organizing, directing, training, promoting, controlling, and other managerial activities involving the collection, use, and dissemination of information. In carrying out its responsibilities for FIP resources acquisition and management, OMB has issued a number of circulars, which are described above.

c. National Institute of Standards and Technology

NIST is an agency of the Department of Commerce. Under the Brooks Act, NIST is responsible for providing scientific and technological services to agencies for ADPE and for developing and maintaining standards to maximize their ability to share computer programs and data. NIST publishes Federal Information Processing Standards Publications, generally called FIPS PUBS, to fulfill this purpose. FIPS PUBS include standards, guidelines, and program information documents. They are classified into seven categories:

1) General Publications
2) Hardware Standards and Guidelines
3) Software Standards and Guidelines
4) Data Standards and Guidelines
5) ADP Operations Standards and Guidelines
6) Related Telecommunications Standards
7) Conformance Tests

The FIPS PUBS Index, updated regularly, lists the most current information.


d. General Services Board of Contract Appeals

The GSBCA is responsible for resolving disputes between contractors and Government agencies, including protests filed under CICA that involve FIP resources. The Board was established by the Contract Disputes Act of 1978 (41 USC 601 et seq.) as an independent tribunal, but it is functionally located within GSA. It can grant the same relief available in the U.S. Claims Court. The GSBCA hears most major protest cases for FIP resources contracts.

e. General Accounting Office (GAO)

The Budget and Accounting Act of 1921 (31 USC Chapter 7) established the GAO to audit Government agencies independently. Over the years, Congress has expanded GAO's audit authority, added new responsibilities and duties, and strengthened its ability to perform independently. GAO's fundamental responsibility is supporting Congress. To meet this objective, GAO performs a variety of services, the most prominent of which are audits and evaluations of Government programs and activities, including major FIP resources acquisitions. GAO, in addition to the GSBCA, is authorized to hear and decide protest cases.

APPENDIX C: BIBLIOGRAPHY

Federal Software Management Support Center (FSMC), Office of Software Development and Information Technology (OSD&IT), General Services Administration (GSA). Conversion Management Handbook. Washington: 1989.

Federal Software Management Support Center (FSMC), Office of Software Development and Information Technology (OSD&IT), General Services Administration (GSA). Model for the Acquisition of Software Support Services. Washington: 1988.

Federal Systems Integration and Management Center (FEDSIM), Information Resources Management Service (IRMS), General Services Administration (GSA). A Guide to Alternative Requirements Analysis Methodologies. Washington: 1990.

Federal Systems Integration and Management Center (FEDSIM), General Services Administration (GSA). A Phased Approach to Life Cycle Management. Washington: 1988.

Federal Systems Integration and Management Center (FEDSIM), General Services Administration (GSA). Survey of Life Cycle Management Methodologies. Washington: 1988.

Humphrey, W.S., and Sweet, W.L. A Method for Assessing the Software Engineering Capability of Contractors. Pittsburgh: Software Engineering Institute, Carnegie Mellon University, 1987.

Information Resources Management Service (IRMS), General Services Administration (GSA). Software Wellness Checklist: A Self-Assessment Guide for Federal Managers. Washington: 1989.

APPENDIX D: GSA ASSISTANCE PROGRAMS

GSA has in place a number of programs to provide assistance to Federal agencies in acquisition and management of Federal information processing (FIP) resources. These programs, administered by the Office of Technical Assistance (OTA), provide nonmandatory reimbursable technical and contractual assistance through the Information Technology Fund. OTA serves as a source of expertise in information resources technology, management, and acquisition for Government. In addition, OTA researches, develops, and disseminates Governmentwide publications, surveys, and monographs to facilitate the exchange of expertise and knowledge throughout the Federal Information Resources Management (IRM) community. OTA sponsors and supports three separate and distinct programs -- the Federal Systems Integration and Management Center (FEDSIM), the Federal Information Systems Support Program (FISSP), and the Federal Computer Acquisition Center (FEDCAC). These programs are discussed below.

a. Federal Systems Integration and Management Center

FEDSIM delivers a wide range of reimbursable FIP support services including IRM planning, systems acquisition, systems design and integration, software management, facilities management, and development of local and wide area networks. FEDSIM assistance includes planning, defining, and documenting system requirements; and effectively managing, operating, and maintaining information systems. FEDSIM provides in-house expertise through its own staff and leverages this expertise through contracts with qualified contractors from the vendor community. For more information on the FEDSIM program, please call (703) 756-4300.

b. Federal Information Systems Support Program

FISSP provides Federal agencies technical and administrative support in acquiring system definition, design, and requirements analysis services; business and scientific software programming services; computer security studies; risk analysis services; and facility management, including local area network management and data capture and retrieval services. FISSP also provides technical assistance in developing statements of work, price negotiation, contract and project management, and financial management. FISSP provides this support to agencies nationwide utilizing task orders through preestablished zonal requirements contracts. Contracting for consolidated requirements allows FISSP to achieve economies through volume buying and avoiding multiple procurements. FISSP provides economies of specialization through value-added technical and procurement expertise. FISSP personnel act as technical consultants, project managers, and contract administrators. Agencies may obtain further information about the scope, availability, and terms of these contracts by contacting the appropriate FISSP zone listed below:

Eastern Zone - (215) 656-6300
Capital Zone - (202) 708-7700
Central Zone - (205) 895-5091
Western Zone - (817) 334-8434
Pacific Zone - (415) 744-8527

c. Federal Computer Acquisition Center

FEDCAC provides acquisition services resulting in competitive acquisition of large non-developmental information systems, including hardware, systems software, and associated services. Through use of in-house technical and contracts staff, FEDCAC assists agencies by defining requirements, developing competitive solicitations, and negotiating and awarding contracts for FIP resources to support the agency's needs. In conducting an acquisition, FEDCAC will--

o Provide overall management of acquisitions by establishing project schedules and milestones
o Provide technical guidance to clients in development of benchmark requirements and provide leadership for live test demonstrations
o Provide independent reviews of source selection material by procurement review committees, legal staff counsels, and quality review panels
o Develop cost evaluation models and determine the lowest present value life cycle cost for each offeror
o Develop pre-negotiation objectives, negotiate, prepare best and final offer requests, and award resulting contracts

For more information on FEDCAC services, please call (617) 863-0104.

APPENDIX E: CONTACTS FOR MORE INFORMATION

SYMBOL   OFFICE                                                                    FTS OR COMMERCIAL PHONE NO.

GA       Administrative and Technical Services Division (GSBCA Protest Results)    (202) 501-0116
KB       Assistant Commissioner for Telecommunications Services                    (202) 501-5308
KE       Office of Information Resources Procurement                               (202) 501-1072
KECP     ADP Systems Procurement Branch                                            (202) 501-0851
KEN      Network Procurement Division                                              (703) 760-7480
KES      Schedules Division                                                        (202) 501-1840
KET      Telecommunications Procurement Division                                   (202) 501-1076
KGDO     Clearinghouse on Computer Accommodation (COCA)                            (202) 501-4906
KM       Office of Information Resources Management Policy                         (202) 501-0202
KMA      Acquisition Reviews Division                                              (202) 501-1126
KAL      Agency Liaison Officer Program Division (Trail Boss and Go for 12 Programs)  (202) 501-0819
KMM      Management Reviews Division                                               (202) 501-1332
KMP      Policy Analysis Division                                                  (202) 501-2462
KMR      Regulations Analysis Division                                             (202) 501-3194
KR       Office of Technical Assistance                                            (703) 756-4300
KRA      Federal Systems Management Center, FEDSIM                                 (703) 756-4111
KRD      Federal Systems Integration Center, FEDSIM                                (703) 756-4162
KRE      Federal Systems Acquisition Center, FEDSIM                                (703) 756-4201
KRF      Federal Computer Acquisition Center, FEDCAC                               (617) 863-0104
KRO      Federal Office Systems Center, FEDSIM                                     (703) 756-6900
KRS      Federal Software Management Center, FEDSIM                                (703) 756-4500
KRT      Federal Information Systems Support Program, FISSP                        (703) 756-4227
KVT      Technical Contract Management Division                                    (202) 501-3881
KXMA     Financial Analysis Branch (Information Technology Fund)                   (202) 501-1183
TNA      FTS2000 Service Oversight Center A (SOC A)                                (703) 760-7515
TNB      FTS2000 Service Oversight Center B (SOC B)                                (703) 904-2811

APPENDIX F: OFFICE OF FEDERAL PROCUREMENT POLICY (OFPP) POLICY LETTER 91-2