

Exhibit 3

Deliverable Document Review Plan

New York State

Office for People With Developmental Disabilities

Division of Service Delivery

Electronic Health Record Project

March, 2017


OPWDD EHR 2017 Deliverable Document Review Plan

Table Of Contents

1. INTRODUCTION
1.1 DOCUMENT PURPOSE
2. DELIVERABLE QUALITY ACTIVITIES
2.1 EVALUATION OF PROJECT DELIVERABLE DOCUMENTATION
2.2 SUBMISSION AND EVALUATION PROCESS
3. DELIVERABLE EVALUATION CRITERIA
3.1 GENERAL DELIVERABLE EVALUATION CRITERIA (GDE)
3.2 BACKUP AND DISASTER RECOVERY PLAN EVALUATION CRITERIA (BDR)
3.3 DATA CONVERSION AUDITING EVALUATION CRITERIA (DCA)
3.4 DATA MIGRATION PLAN EVALUATION CRITERIA (DMP)
3.5 DESIGN EVALUATION CRITERIA (DES)
3.6 HOSTING PLAN EVALUATION CRITERIA (HSE)
3.7 IMPLEMENTATION PLAN EVALUATION CRITERIA (IPE)
3.8 PROJECT PLAN EVALUATION CRITERIA (PPE)
3.9 RELEASE MANAGEMENT PLANS EVALUATION CRITERIA (RPE)
3.10 REQUIREMENTS MANAGEMENT PLAN EVALUATION CRITERIA (RMP)
3.11 REQUIREMENTS EVALUATION CRITERIA (RE)
3.12 REQUIREMENTS TRACEABILITY MATRIX EVALUATION CRITERIA (RTM)
3.13 SITE AND SYSTEM SECURITY PLAN EVALUATION CRITERIA (SSSP)
3.14 SYSTEM VALIDATION EVALUATION CRITERIA (SVE)
3.15 TEST AUDITING (TA)
3.16 TEST PLAN EVALUATION CRITERIA (TPE)
3.17 TEST STRATEGY EVALUATION CRITERIA (TSTR)
3.18 TEST CASE EVALUATION CRITERIA (TCSE)
3.19 TEST SCRIPT EVALUATION CRITERIA (TSE)
3.20 TEST RESULTS EVALUATION CRITERIA (TRE)
3.21 TRAINING PLAN EVALUATION CRITERIA (TRPE)
APPENDIX 1 EXECUTIVE SUMMARY
APPENDIX 2 DETAILED FINDINGS
APPENDIX 3 SATISFIED FINDINGS

Table Of Tables

Table 1, EHR Deliverable Document Expectations and Evaluation Criteria
Table 2, General Deliverable Evaluation Criteria
Table 3, Backup and Disaster Recovery Plan Evaluation Criteria
Table 4, Data Conversion Auditing Evaluation Criteria
Table 5, Data Migration Plan Evaluation Criteria
Table 6, Design Evaluation Criteria
Table 7, Hosting Plan Evaluation Criteria
Table 8, Implementation Plan Evaluation Criteria
Table 9, Project Plan Evaluation Criteria
Table 10, Release Management Plans Evaluation Criteria
Table 11, Requirements Management Plan Evaluation Criteria
Table 12, Requirements Evaluation Criteria
Table 13, Requirements Traceability Matrix Evaluation Criteria
Table 14, Site and System Security Plan Evaluation Criteria
Table 15, System Validation Evaluation Criteria


Table 16, Test Auditing Evaluation Criteria
Table 17, Test Plan Evaluation Criteria
Table 18, Test Strategy Evaluation Criteria
Table 19, Test Case Evaluation Criteria
Table 20, Test Script Evaluation Criteria
Table 21, Test Results Evaluation Criteria
Table 22, Training Plan Evaluation Criteria


OPWDD EHR Deliverable Review Report


1. INTRODUCTION

1.1 Document Purpose

This document describes the deliverable documentation review process and evaluation criteria that the OPWDD will use when evaluating the selected Bidder's deliverable documentation for the Office for People With Developmental Disabilities (OPWDD) Electronic Health Record (EHR) project.

2. DELIVERABLE QUALITY ACTIVITIES

2.1 Evaluation of Project Deliverable Documentation

The evaluation of project deliverable documentation monitors the project deliverables throughout development to verify that the documents are of acceptable quality, complete, and correct. Clear and sufficient documentation of all project activities is necessary for many reasons, but mainly to justify payment and to ensure that all features within the scope of the project, as established in the contract, are fully implemented and operate successfully. Table 1, EHR Deliverable Document Expectations and Evaluation Criteria, identifies:

• The major project deliverables and associated documentation that will be evaluated for a satisfactory quality level.

• The quality standards, expectations, and evaluation criteria established for each project deliverable, including any organizational standards that must be followed.

Specific evaluation criteria are listed in Section 3, DELIVERABLE EVALUATION CRITERIA.

2.2 Submission and Evaluation Process

Outlined below is the process that will be followed to review and accept required deliverable documents.

1. The deliverable document review process starts when the selected Bidder submits a deliverable document for review. We strongly encourage the selected Bidder to submit early drafts or document outlines to the OPWDD for informal review prior to completing the formal submission. The OPWDD will then provide informal comments on the deliverable document back to the selected Bidder. The intent is to ensure that all stakeholder expectations are aligned as to the purpose and content of the document to be submitted, before effort is expended that takes the document development in an unexpected direction. The informal review process may consist of one-on-one phone calls and/or larger group discussions. Any unresolved issues will be escalated as defined in the communication plan. This informal submission/review process can have as many cycles as necessary to prepare the deliverable document for formal submission.

2. Once the document is ready for formal review, the selected Bidder will formally submit the deliverable document and the Deliverable Acceptance Form by uploading them to the OPWDD EHR Project SharePoint site and notifying the project team via email. The Deliverable


Acceptance Form is available as Appendix G - Deliverable Acceptance Form on the EHR RFP 2017 web site.

3. The OPWDD will perform an initial document review, determine if the document is technically and structurally complete, and then notify reviewers that the deliverable document is ready for their subject matter experts' detailed review. The OPWDD will collect and document any and all findings in a Deliverable Document Review Report (see Attachment 1), place it on the OPWDD EHR Project SharePoint site, and notify the project team via email. The Deliverable Acceptance Form will also be updated to indicate the level of acceptance (typically Unacceptable, Conditionally Acceptable, or Approved) and placed on the OPWDD EHR Project SharePoint site.

4. If the level of acceptance for the document is Unacceptable or Conditionally Acceptable, the selected Bidder will need to satisfactorily address all findings outlined in the Deliverable Document Review Report and resubmit within five (5) business days for review, following the process described above. If, after three (3) review cycles, there are remaining unresolved findings and the document remains in an Unacceptable or Conditionally Acceptable status, the issue will be raised to the OPWDD Project Executive Leadership for determination on how to proceed.

Table 1, EHR Deliverable Document Expectations and Evaluation Criteria

Deliverable Expectation Documents (DED)
  Description: The DED (one for each deliverable document) outlines the expectations for each deliverable document identified herein.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria

Initial Startup Schedule
  Description: Provided within 24 hours of contract signing; details all activities to be undertaken during the Project Initiation and Planning phase.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria

Project Implementation Plan
  Description: A narrative work plan clearly describing the approach to this project and stating specifically how deliverables, including the optional financial and billing modules, will be achieved by the Bidder.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria

Detailed Project Schedule
  Description: A schedule of major activities and milestones.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria

Weekly Status Report Format
  Description: Draft outline of the weekly status report.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria

Project Management Plan
  Description: A comprehensive set of plans to manage the project from inception until project termination.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria
  • Requirements Management Plan Evaluation Criteria
  • Requirements Traceability Matrix Evaluation Criteria

Detailed MS Project Plan and Schedule
  Description: MS Project Plan with a WBS of level 5.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria

Communication Plan
  Description: Plan for managing communications, including a project org chart.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria


Risk Management Plan
  Description: Vendor plan for managing risk.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria

Issue Management Plan
  Description: Vendor plan for managing issues.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria

Action Item Management Plan
  Description: Vendor plan for managing action items.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria

Requirements Management Plan
  Description: Vendor plan for managing requirements.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria

Change Management Plan
  Description: Vendor plan for managing change requests.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria

Quality Management Plan
  Description: Vendor plan for managing and maintaining quality.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria

Staffing Plan
  Description: Detailed vendor staffing plan.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Project Plan Evaluation Criteria

Requirement Traceability Matrix (RTM)
  Description: Matrix for tracing requirements from the RFP through design and final UAT testing.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Requirements Traceability Matrix Evaluation Criteria

Gap Analysis
  Description: Analysis of the requirements not implemented in the core COTS product, which will require further development activities.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria

Design Specification Document
  Description: Defines the functional implementation of each requirement.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Design Evaluation Criteria

Technical Specification Document
  Description: Defines the end-to-end system technical architecture and specifications.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria

Security Plan
  Description: The Vendor's comprehensive plan for meeting all project security requirements.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Site and System Security Plan Evaluation Criteria
  • Backup and Disaster Recovery Plan Evaluation Criteria

System Security Consensus Document
  Description: Contains all information relevant to the system's Certification & Accreditation. Defines and consolidates the system's security architecture, security policies, risk assessments, and security tests.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Site and System Security Plan Evaluation Criteria
  • Backup and Disaster Recovery Plan Evaluation Criteria

Disaster Recovery Plan
  Description: Plan to restore operability and protect data in the event of an extended interruption of services.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Site and System Security Plan Evaluation Criteria
  • Backup and Disaster Recovery Plan Evaluation Criteria

Data Migration Plan
  Description: Describes the detailed strategies and approaches for converting, migrating, and validating data.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Data Migration Plan Evaluation Criteria

Data Migration Reports (Exception Reports)
  Description: Report on errors and exceptions in the data migration process.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Data Migration Plan Evaluation Criteria

Data Mapping
  Description: Comprehensive mapping of data elements between the existing and to-be systems.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Data Migration Plan Evaluation Criteria


Data Dictionary
  Description: Describes the contents, format, and structure of the new database, and the relationships between its elements.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Data Migration Plan Evaluation Criteria

Test Strategy
  Description: Vendor strategy for all phases of system testing.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Test Strategy Evaluation Criteria

Test Plan
  Description: Vendor plan for implementing the test strategy.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Test Plan Evaluation Criteria

Test Cases
  Description: The set of conditions, variables, assumptions, criteria, and expected results for system testing of all functional requirements.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Test Case Evaluation Criteria

Test Scripts
  Description: Procedures to be followed in order to perform specific requirement tests.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Test Script Evaluation Criteria

Test Results
  Description: Outcomes for each test script performed.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Test Results Evaluation Criteria

Bug Fix/Remediation Plan
  Description: Vendor plan for managing exceptions encountered during testing.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Test Results Evaluation Criteria

Release Management Plan
  Description: Vendor plan for managing subsequent software upgrades and release notes.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Release Management Plan Evaluation Criteria

Help Desk Plan
  Description: Detailed vendor plan for implementing the Help Desk requirements.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria

Training Plan
  Description: Detailed vendor plan for implementing the Training requirements.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Training Plan Evaluation Criteria

Training Schedule
  Description: Detailed schedule for implementing the Training Plan.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Training Plan Evaluation Criteria

Training Materials
  Description: User Guides, Training Videos, Training Guides, etc.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Training Plan Evaluation Criteria

Security documentation and artifacts as requested
  Description: Documentation and artifacts demonstrating compliance with security requirements, including HIPAA, HITECH, FedRAMP, MARS-E, and the NYS Information Security Policies and Standards.
  Evaluation Criteria:
  • General Deliverable Evaluation Criteria
  • Site and System Security Plan Evaluation Criteria
  • Backup and Disaster Recovery Plan Evaluation Criteria


3. DELIVERABLE EVALUATION CRITERIA

3.1 General Deliverable Evaluation Criteria (GDE)

Table 2, General Deliverable Evaluation Criteria

GDE-1: Does the deliverable/document conform to the appropriate, applicable, and/or required deliverable/documentation standards and required or recommended deliverable/document formats, including:
• A cover page that includes:
  o Document Name
  o Document Version
  o Document Date
  o Vendor Name
  o Governing deliverable requirements addressed within
• A document revision history
• A Table of Contents
• Page numbering on each page that includes the total number of pages in the document
• References to applicable source documents (contract, RFP, other project deliverables, etc.)
• For multiple files, a main document that lists the component files and their purpose

GDE-2: Is the deliverable/document internally consistent? Criteria for evaluating internal consistency include:
1. Are all statements compatible?
2. Is the document style consistent?
3. Does each given term have the same meaning throughout?
4. Are given item(s) or concept(s) referred to by the same name or description throughout?
5. Is the level of detail and presentation style the same throughout?

GDE-3: Is the document consistent with external documents that it references? Criteria for evaluating external consistency include:
1. Are all statements compatible?
2. Is the document style consistent across documents?
3. Does each given term have the same meaning across documents?
4. Are given item(s) or concept(s) referred to by the same name or description across documents?
5. Is the level of detail and presentation style the same across documents, taking into account inherent differences in the document types?

GDE-4: Is the document understandable? Criteria for evaluating understandability include:
1. Does the document follow generally accepted rules of spelling, grammar, capitalization, punctuation, symbols, and notation?
2. Are all non-standard terms, phrases, acronyms, and abbreviations defined?
3. Is the document's level of complexity appropriate for the intended audience?
4. Is the material presented such that it can be interpreted in only one way?


GDE-5: Is the document technically adequate? Criteria for evaluating technical adequacy include:
1. Is the overall approach sound?
2. Does the document violate known facts or principles?
3. Is the technical approach consistent with approaches or best practices known to be successful on other similar projects?
4. Is the technical approach well researched or based on proven prototypes?
5. Does the document appear well thought out, not thrown together?
6. Does the technical approach make sense both technically and practically?

GDE-6: Is the document complete?
At every stage of document development, all required sections and section headers should be present. Completeness of paragraph content depends upon when the required information is, or should be, known. Criteria that may apply, depending on the document, include:
1. Is the document delivery schedule identified? (Pilot, Draft, Release 1, etc.)
2. Is the intent and purpose of the document identified?
3. Is the appropriate document audience identified?
4. Are all appropriate sections of the document present for the delivery and addressed in adequate detail?
5. Are all required sections and section headers present, regardless of delivery timing?
6. Are an acronym list and glossary included and complete?
7. Does the document indicate the appropriate reviews and approvals?

GDE-7: Does the document contain appropriate content for the intended audience?

GDE-8: Is the document in alignment with established project management and deliverable objectives?

GDE-9: Is the document traceable to indicated documents?
1. Is the deliverable in agreement with any predecessor documents?
2. Does the document fully implement the applicable stipulations of a predecessor document?
3. Does all material in a subsequent or lower-level document have its basis in the predecessor document; that is, has no untraceable material been introduced?

GDE-10: Is the document internally traceable?
1. Are all of the sections of the document in agreement with the other sections?
2. Does each referenced section contain the information or otherwise fully implement the applicable stipulations indicated in the reference?
3. Is all of the material in lower-level or detailed sections of the document based on the predecessor sections; that is, has no untraceable material been introduced?

GDE-11: Is the deliverable/document on schedule?
1. Does the delivery of the deliverable/document conform to the current project schedule?
2. If the delivery does not conform to the current project schedule and is late, has a valid reason been provided?
3. If the delivery does not conform to the current project schedule and is late, does the late delivery indicate a risk to the overall project schedule and deadlines?

3.2 Backup and Disaster Recovery Plan Evaluation Criteria (BDR)

Table 3, Backup and Disaster Recovery Plan Evaluation Criteria

BDR-01: Does the plan and procedure provide existing or to-be-developed disaster recovery strategies for the solution?


BDR-02: Do the backup and recovery strategies identify the following:
• The location of the solution?
• Regularly scheduled backups of the solution from all devices, and their frequency?
• That backups are stored with the same level of security as the original data?
• An application recovery time objective for each component of the solution?

BDR-03: Does the plan provide existing or to-be-developed data backup and recovery strategies?

BDR-04: Do the data backup and recovery strategies identify the following:
• The location of the solution data?
• Regularly scheduled backups of the solution data from all devices?
• That backups are stored with the same level of security as the original data?
• A data recovery point objective for data loss?
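The recovery objectives named in BDR-02 and BDR-04 have precise meanings: a recovery time objective (RTO) bounds how long restoring a component may take, while a recovery point objective (RPO) bounds how much data, measured as time elapsed since the last good backup, may be lost. As a minimal illustrative sketch (not part of the plan; the schedule and objective values below are hypothetical), verifying an RPO reduces to comparing backup age against the objective:

```python
from datetime import datetime, timedelta

def meets_rpo(last_backup: datetime, now: datetime, rpo: timedelta) -> bool:
    """True if a failure at `now` would lose no more data than the RPO allows."""
    return (now - last_backup) <= rpo

# Hypothetical example: nightly 2:00 AM backups checked against a 24-hour RPO.
rpo = timedelta(hours=24)
last_backup = datetime(2017, 3, 1, 2, 0)
assert meets_rpo(last_backup, datetime(2017, 3, 1, 20, 0), rpo)     # same day: within RPO
assert not meets_rpo(last_backup, datetime(2017, 3, 3, 2, 1), rpo)  # backups missed: RPO violated
```

An evaluator can apply the same comparison per component, since BDR-02 asks for a recovery time objective for each component of the solution.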

3.3 Data Conversion Auditing Evaluation Criteria (DCA)

Table 4, Data Conversion Auditing Evaluation Criteria

DCA-01: Did the migration result in the expected number of new objects as a result of the automated or manual migration processes?

DCA-02: Have all data migration errors been resolved?

DCA-03: Has a random inspection of the migrated data been performed that resulted in no identified migration data errors?
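Checks of the kind described in DCA-01 and DCA-03 lend themselves to simple automation. The sketch below is illustrative only and not part of the plan; the table and key names are hypothetical. It compares source and target row counts (a DCA-01-style check) and randomly inspects migrated rows against the source (a DCA-03-style check):

```python
import random
import sqlite3

def audit_migration(src: sqlite3.Connection, dst: sqlite3.Connection,
                    table: str, key: str, sample_size: int = 5) -> list:
    """Return a list of audit findings; an empty list means the checks passed.

    `table` and `key` are trusted identifiers supplied by the auditor,
    not end-user input.
    """
    findings = []

    # DCA-01 style check: did migration produce the expected object count?
    src_count = src.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    dst_count = dst.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if src_count != dst_count:
        findings.append(f"{table}: expected {src_count} rows, found {dst_count}")

    # DCA-03 style check: randomly inspect migrated rows against the source.
    keys = [row[0] for row in src.execute(f"SELECT {key} FROM {table}")]
    for k in random.sample(keys, min(sample_size, len(keys))):
        src_row = src.execute(f"SELECT * FROM {table} WHERE {key}=?", (k,)).fetchone()
        dst_row = dst.execute(f"SELECT * FROM {table} WHERE {key}=?", (k,)).fetchone()
        if src_row != dst_row:
            findings.append(f"{table}: row {key}={k} does not match source")
    return findings
```

A count mismatch or a sampled row that differs from its source produces a finding, which corresponds to a migration data error under DCA-01 or DCA-03.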

3.4 Data Migration Plan Evaluation Criteria (DMP)

Table 5, Data Migration Plan Evaluation Criteria

DMP-01: Does the data migration plan identify the scope of the data migration?

DMP-02: Does the data migration plan identify the data migration requirements?

DMP-03: Does the data migration plan identify the data sources that will need to be migrated and the data structure for each?

DMP-04: Does the data migration plan identify the one-time data migration efforts and the data migration efforts that need to be repeated periodically?

DMP-05: For the periodic data migrations, does the plan identify how the data migration rules/logic/code will be maintained?

DMP-06: Does the data migration plan identify any required data transformations?

DMP-07: Does the data migration plan identify whether it is necessary to have the source and the destination databases running simultaneously?

DMP-08: Does the data migration plan identify any connectivity issues that will be involved in migrating the data?

DMP-09: Does the data migration plan identify any downtime that will be required during the migration?

DMP-10: Does the data migration plan identify any additional hardware or software required to perform the migration?

DMP-11: Does the data migration plan identify the data migration testing and acceptance criteria?

DMP-12: Does the data migration plan identify the capture and migration process?

DMP-13: Does the data migration plan categorize the total number of database objects, and identify the number of objects that can be converted and migrated automatically and those that will need to be migrated manually?


DMP-14: Does the data migration plan identify how data migration errors will be handled for:
• Data that did not migrate successfully
• Stored procedures, views, and triggers that did not parse
• Syntax that requires manual intervention
• Database objects that were not created successfully

DMP-15: Does the data migration plan identify the approach/testing for verifying the migrated data's correctness?

DMP-16: Does the data migration plan identify how long the data migration will take and how long it will take to rectify errors?

DMP-17: Does the data migration plan identify a recovery plan with recovery options for each stage of the migration?

DMP-18: Does the data migration plan identify any training requirements?

DMP-19: Does the data migration plan identify any migration deliverables?

3.5 Design Evaluation Criteria (DES)

Table 6, Design Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

DES-01 Are all functional requirements allocated to a section of the design?

DES-02 Are all of the future state business process workflows implemented in the design?

DES-03 Are all of the approved Fit Gap recommendations implemented in the design?

DES-04 Are the designs consistent?

DES-05 Are the designs complete?

DES-06 Do the designs properly address the development standards?

DES-07 Does the design specification provide clear design objectives and goals for project?

DES-08 Does the design specification provide well established design criteria and stated product design

specifications?

DES-09 Does the design specification present alternative designs that were considered?

DES-10 Does the design specification document use of established design methodologies?

DES-11 Does the design specification adequately describe all of the key features?

DES-12 Does the design specification clearly and accurately describe any design constraints?

DES-13 Does the design specification identify and describe analysis cases, including the rationale for each and any design constraints?

DES-14 Is the design presented in the document feasible?

1. Can the design be implemented given the state of the art, schedule and resource constraints,

available tools and techniques, and other factors affecting the project?

2. Can the design be implemented given dependencies on other project activities?

DES-15 Does the design indicate use of appropriate requirement, design, or coding techniques?

DES-16 Does the design indicate the appropriate level of detail?

Architecture Evaluation

DES-21 Availability: Is the system availability discussed? Does the design address how the system will satisfy the required uptime? This can be addressed in terms of:

Failover

Transaction Manager

Stateless Design

DES-22 Conceptual Integrity: Is the system conceptual integrity discussed? Is there an underlying theme or vision that unifies the design of the system at all levels?


DES-23 Debug-ability/Monitoring: Is the system debug-ability/monitoring discussed? Will the system provide easy and efficient debugging, registration of abnormal behavior, and real-time monitoring?

DES-24 Ease of Administration: Is the ease of administration discussed? This discussion can include the

infrastructure, tools, and staff of administrators and technicians needed to maintain the health of

the application.

DES-25 Ease of Deployment: Is the ease of deployment discussed?

DES-26 Efficiency: Is the system efficiency discussed? Does the description discuss the use of the

resources available for execution of software, and how this impacts response times, throughput

and storage consumption?

DES-27 Extensibility: Is the system extensibility discussed? Will the system allow new feature implementation, replacement of components with improved versions, and the removal of unwanted or unnecessary features or components?

DES-28 Functionality: Is the system functionality discussed? Does the system seem to enable users to perform the work for which it was intended?

DES-29 Interoperability: Is the system interoperability discussed? This can include discussion of interaction with other sub-systems, well-defined access to externally visible functionality and data structures, or interactions with other run-time environments.

DES-30 Maintainability: Is the system maintainability discussed? This can include a discussion on

problem fixing, repairing a software system after errors occur, etc.

DES-31 Modifiability: Is the system modifiability discussed? Will the system provide the ability to make

changes to the system quickly and cost effectively? This can be addressed in terms of:

Client-server (separation of concerns) – provide a collection of services from a central

process and allow other processes to use the services.

Independence of interface from implementation – allow substitution of different

implementations for the same functionality.

Separation – separate data and functions.

Encoding – meta-data and language interpreters that provide a mechanism for interpreting

data and simplify modifications to parameters of the data.

Run-time discovery – no hard-coded connection strings, queue names, etc.
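The "run-time discovery" item in DES-31 can be sketched briefly. Nothing here is mandated by the plan; the environment variable name EHR_DB_CONNECTION and the config file layout are hypothetical, chosen only to show the pattern of resolving a connection string at startup rather than hard-coding it:

```python
import json
import os

def resolve_connection_string(config_path: str = "app_config.json") -> str:
    # 1. An environment variable takes precedence (set per deployment).
    conn = os.environ.get("EHR_DB_CONNECTION")
    if conn:
        return conn
    # 2. Otherwise fall back to a deployment-specific config file.
    try:
        with open(config_path) as f:
            return json.load(f)["db_connection"]
    except FileNotFoundError:
        raise RuntimeError("no connection string configured")

# Simulate a deployment that sets the variable; the value is illustrative.
os.environ["EHR_DB_CONNECTION"] = "Server=db01;Database=ehr;"
print(resolve_connection_string())  # → Server=db01;Database=ehr;
```

Because nothing is compiled in, substituting a different database (or queue name, per the same bullet) is a deployment change, not a code change, which is exactly the modifiability DES-31 asks about.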

DES-32 Performance: Is the system performance discussed? Does the time required to respond to stimuli

(events) or the number of events processed in an interval of time satisfy the system requirements?

This can be discussed in terms of:

Connection pooling

Load balancing

Distributed processing

Caching

Transaction Concurrency

Replication of data
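Of the DES-32 techniques listed above, caching is the simplest to sketch. The illustration below uses Python's standard functools.lru_cache; the function name and lookup codes are hypothetical, and a real design would also have to state what is cached, where, and how it is invalidated:

```python
from functools import lru_cache

# Counter standing in for the number of slow back-end lookups performed.
CALLS = {"count": 0}

@lru_cache(maxsize=128)
def fetch_reference_data(code: str) -> str:
    # Stand-in for an expensive database or service lookup.
    CALLS["count"] += 1
    return f"description for {code}"

fetch_reference_data("ICD-A01")
fetch_reference_data("ICD-A01")  # served from the cache; no second lookup
print(CALLS["count"])  # → 1
```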

DES-33 Portability: Is the system portability discussed? Does the system have the ability to run under

different computing environments?

DES-34 Reliability: Is system reliability addressed, including identifying how the system will keep

operating over time in the context of application and system errors and in situations of unexpected

or incorrect usage (to perform in a predictable manner)?

DES-35 Scalability: Is the system scalability discussed? Does the system support continuous growth to

meet user demand and business complexity?


DES-36 Security: Is the system security architecture discussed? Does the architecture provide the ability

to resist unauthorized attempts at usage and denial of service? This can be addressed in terms of:

Authorization

Authentication

Auditing

Integrity

Confidentiality

Denial of Service

Data isolation

DES-37 Testability: Is the system testability discussed? How well can the system be tested?

DES-38 Usability: Is the system usability discussed? This should include a discussion on how easy it is to use.

3.6 Hosting Plan Evaluation Criteria (HSE)

Table 7, Hosting Plan Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

HSE-1 Does the hosting plan describe how the hosting services that will be provided will satisfy the Hosting Services specified in the OPWDD EHR RFP?

HSE-2 Does the hosting plan describe how the operational services that will be provided will satisfy the Hosting Services specified in the OPWDD EHR RFP?

HSE-3 Does the hosting plan describe how the maintenance services that will be provided will satisfy the Hosting Services specified in the OPWDD EHR RFP?

HSE-4 Does the Monthly Service Level Report indicate adequate and appropriate:

Response Time statistics?

Scheduled Maintenance Occurrences?

System Utilization measures?

Service Downtimes?

Downtime Detailed Reports?

HSE-5 Does the hosting plan describe the backup and disaster recovery processes?

HSE-6 Does the hosting plan describe the site security?

HSE-7 Does the hosting plan describe the system security?

3.7 Implementation Plan Evaluation Criteria (IPE)

Bidders are required to provide a narrative work plan clearly describing their approach to this project. The Bidder must state specifically how deliverables, including the optional financial and billing modules, will be achieved by the Bidder. This work plan should address:

Table 8, Implementation Plan Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

IPE-1 The Bidder’s approach to managing the system implementation and integration work. It is

expected that the successful Bidder will assign an overall project manager for this engagement.

IPE-2 A description of the requirements management process that the Bidder plans to use during

system implementation.

IPE-3 The approach to risk management and what the Bidder considers to be the key risks to the

success of this implementation and how these risks should be addressed.

IPE-4 A description of any proprietary tools, techniques or technologies that the bidding firm uses for

such implementation work.


IPE-5 A description of the Bidder’s staffing level that is anticipated to accomplish the work including

the level of onsite presence.

IPE-6 A description on the use of subcontractors if applicable.

IPE-7 A listing of roles the Bidder expects from State personnel to be involved in the implementation

and description on how these State personnel will be incorporated into the project team.

IPE-8 An outline of the project organization structure which depicts the key individuals and areas of

responsibility.

IPE-9 A plan on how all the parties involved in the implementation and integration effort will be

coordinated.

IPE-10 The timeframe after contract signing within which the Bidder's resources can begin the project and the implementation can start.

IPE-11 The timeline for project tasks, milestones and deliverables that will allow for system rollout in

the timeframe specified by OPWDD.

IPE-12 A. A description of your Project development life cycle and milestones.

IPE-13 B. The work plan for managing the system implementation and integration work.

IPE-14 C. A description of the requirements management process that the Bidder plans to use

during the implementation of the EHR system.

IPE-15 D. A plan for converting data into the Bidder’s EHR System to populate the individual’s

care coordination record as outlined in section 3 of this RFP.

IPE-16 E. The approach to risk management and what you consider to be the key risks to the

success of this implementation and how these risks will be addressed and managed.

IPE-17 F. A method for status reporting and periodic status meetings.

IPE-18 G. The quality assurance measures and the plan that will be utilized to monitor the project

and address issues, both foreseen and unforeseen.

IPE-19 H. A description of all work anticipated in implementing the scope of services, including a

time frame for implementing project components.

IPE-20 I. A project plan identifying activities required for implementation, key milestones and

the implementation timeframe.

IPE-21 J. A description of any proprietary tools, techniques or technologies that the Bidder’s firm

uses for such implementation work.

IPE-22 K A project organization structure and a plan on how all the parties involved in the

implementation and integration effort will be coordinated. This structure and plan should

include:

A description of the Bidder’s staffing level that is anticipated to accomplish the work including

any onsite requirements.

A description on the Bidder’s use of subcontractors if applicable.

A listing of roles the Bidder would expect from State personnel to be involved in the

implementation and description on how these personnel will be incorporated into the project

team.

L. For each project component, identify the title of staff that will be involved and an estimate of

staff time to complete the component.

3.8 Project Plan Evaluation Criteria (PPE)

Table 9, Project Plan Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

PPE-1 Does the Project Plan identify a Work Breakdown Structure (WBS), or where it is located?


PPE-2 Does the Project Plan identify a Project Schedule, or where it is located?

PPE-3 Does the Project Plan identify a Quality Management Plan, or where it is located?

PPE-4 Does the Project Plan identify a Risk Management Plan, or where it is located?

PPE-5 Does the Project Plan identify a Change Management Plan, or where it is located?

PPE-6 Does the Project Plan identify an Acceptance Management Plan, or where it is located?

PPE-7 Does the Project Plan identify an Issue Management and Escalation Plan, or where it is located?

PPE-8 Does the Project Plan identify a Communication Plan, or where it is located?

PPE-9 Does the Project Plan identify an Implementation/Transition (including migration plans) Plan, or

where it is located?

PPE-10 Does the Project Plan identify a Training Plan, or where it is located?

Work Breakdown Structure

PPE-21 Is the WBS broken down starting at the top level tasks?

PPE-22 Is the WBS managed using standard project management software?

PPE-23 Does the WBS identify all tasks on the project?

PPE-24 Does the WBS present meaningful project information at the summary task level?

PPE-25 Do the work packages add up to the summary task?

PPE-26 Is each summary task and work package named for the product that is produced?

PPE-27 Does each summary task and work package produce a product?

PPE-28 Are identified tasks manageable? (greater than 8 hours and less than 80 hours is recommended)

PPE-29 Are the project management activities included in the WBS?

Project Schedule

PPE-41 Is the Project Schedule broken down starting at the top level tasks?

PPE-42 Is the Project Schedule managed using standard project management software?

PPE-43 Does the Project Schedule identify all tasks on the project?

PPE-44 Does the Project Schedule present meaningful project information at the summary task level?

PPE-45 Do the work packages add up to the summary task?

PPE-46 Is each summary task and work package named for the product that is produced?

PPE-47 Does each summary task and work package produce a product?

PPE-48 Are identified tasks manageable? (greater than 8 hours and less than 80 hours is recommended)

PPE-49 Are the project management activities included in the Project Schedule?

PPE-50 Is the Project Schedule maintained and updated during the project at agreed to intervals?

Quality Management Plan

PPE-61 Does the Quality Management Plan identify the project quality management process?

PPE-62 Does the Quality Management Plan identify names and titles of the persons who will have

oversight responsibilities for quality reviews?

PPE-63 Does the identification of the persons with oversight responsibilities, together with the project

organizational chart, provide reasonable assurance that the quality review process will be

objective, unbiased, and accurate?

PPE-64 Does the Quality Management Plan describe the general process for the selection of quality

reviewers?

PPE-65 Does the general process for the selection of reviewers described in the Quality Management Plan provide reasonable assurance that objective and unbiased individuals will be selected to complete the reviews?

PPE-66 Does the Quality Management Plan describe how the quality reviews will be planned in order to

review as much of the project material as is reasonable?


PPE-67 Does the Quality Management Plan describe how the quality reviews will be held and what

project material will be reviewed?

PPE-68 Does the Quality Management Plan define any standards that the project material will be

reviewed against?

PPE-69 Does the Quality Management Plan describe the general process for maintaining project

information during the review process?

PPE-70 Does the Quality Management Plan process for maintaining project information provide

reasonable assurance that the integrity of the information will be maintained throughout the

project?

PPE-71 Does the Quality Management Plan describe how documents, results, analyses, and all other

materials related to quality reviews will be stored and maintained, including privacy and

program integrity?

PPE-72 Does the Quality Management Plan describe how quality review information will be reported?

Risk Management Plan

RISK-01 Does the Risk Management Plan identify how the risk management activities will be conducted

for the project?

RISK-02 Does the Risk Management Plan identify how risks will be identified and what characteristics

will be documented?

RISK-03 Does the Risk Management Plan identify how risks will be prioritized for further analysis or action by assessing and combining their probability of occurrence and impact? (Qualitative Risk Analysis)

RISK-04 Does the Risk Management Plan identify how risks will be numerically analyzed to determine

the effect of identified risks on overall project objectives? (Quantitative Risk Analysis)
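The qualitative analysis named in RISK-03 is commonly realized as a probability-times-impact score used to rank the risk register (RISK-13). A minimal sketch follows; the 1–5 rating scales, the priority thresholds, and the sample risks are illustrative assumptions, not values taken from the plan:

```python
def risk_score(probability: int, impact: int) -> int:
    # Qualitative scoring: each risk rated 1-5 for probability and impact.
    for v in (probability, impact):
        if not 1 <= v <= 5:
            raise ValueError("ratings must be 1-5")
    return probability * impact

def priority(score: int) -> str:
    # Illustrative banding of the 1-25 score into priority levels.
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Hypothetical register entries: (description, probability, impact).
risks = [
    ("data conversion slippage", 4, 5),
    ("key staff turnover", 2, 4),
    ("minor report formatting defects", 3, 1),
]
ranked = sorted(risks, key=lambda r: risk_score(r[1], r[2]), reverse=True)
print(ranked[0][0])                # → data conversion slippage
print(priority(risk_score(4, 5)))  # → high
```

RISK-04's quantitative analysis would replace the ordinal ratings with numeric estimates (e.g., expected cost or schedule impact) before ranking.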

RISK-05 Does the Risk Management Plan identify the process of developing options and actions to

enhance opportunities and reduce threats to project objectives?

RISK-06 Does the Risk Management Plan identify how risks will be controlled, including implementing risk response plans, tracking identified risks, monitoring residual risks, identifying new risks, and evaluating the risk process throughout the project?

RISK-07 Does the Risk Management Plan identify the project risk management process?

RISK-08 Does the Risk Management Plan identify the risk identification process?

RISK-09 Does the Risk Management Plan identify how risks will be evaluated?

RISK-10 Does the Risk Management Plan identify how the likelihood of the risks will be estimated?

RISK-11 Does the Risk Management Plan identify how expected severity of impact from the risks will be

determined?

RISK-12 Does the Risk Management Plan identify how a mitigation strategy that minimizes the likelihood

and/or severity of the risks will be determined?

RISK-13 Does the Risk Management Plan identify how risks will be managed? Will a risk register be

utilized?

Change Management Plan

PPE-101 Does the Change Management Plan identify the project change management process?

PPE-102 Does the Change Management Plan identify a change process that ensures that, if a change is

made, it is done in the most efficient way possible, following the established procedures and

ensuring the quality and continuity of the system?

PPE-103 Does the Change Management Plan identify how changes will be identified?

PPE-104 Does the Change Management Plan identify how changes will be properly recorded, classified

and documented?

PPE-105 Does the Change Management Plan identify the change review and approval process?


PPE-106 Does the Change Management Plan identify how changes will be planned, implemented and

tested?

PPE-107 Does the Change Management Plan identify mitigation/backout plans for all changes?

PPE-108 Does the Change Management Plan identify emergency change procedures?

Acceptance Management Plan

PPE-121 Does the Acceptance Management Plan identify the project acceptance management process?

PPE-122 Does the Acceptance Management Plan identify the user acceptance testing roles and

responsibilities?

PPE-123 Does the Acceptance Management Plan identify the acceptance testing strategy?

PPE-124 Does the Acceptance Management Plan identify the acceptance testing scenarios?

PPE-125 Does the Acceptance Management Plan identify the assumptions, constraints, and dependencies?

PPE-126 Does the Acceptance Management Plan identify how acceptance testing issues will be recorded,

managed, resolved, and closed?

PPE-127 Does the Acceptance Management Plan identify the acceptance test schedule?

PPE-128 Does the Acceptance Management Plan identify the acceptance procedure(s)?

Issue Management Plan

PPE-141 Does the Issue Management Plan identify the project issue management roles and

responsibilities?

Does the Issue Management Plan identify the project issue identification process, including the

identification criteria?

Does the Issue Management Plan identify the project issue management process?

PPE-142 Does the Issue Management Plan identify the project issue resolution process?

PPE-143 Does the Issue Management Plan identify the project issue escalation process?

Communication Plan

PPE-161 Does the Communication Plan identify the project communication process?

PPE-162 Does the Communication Plan identify the project communication roles and responsibilities?

PPE-163 Does the Communication Plan identify the communication technology planned for use?

PPE-164 Does the Communication Plan identify contact information for all project participants?

PPE-165 Does the Communication Plan identify the project information distribution methods?

PPE-166 Does the Communication Plan identify sample communications templates?

PPE-167 Does the Communication Plan identify sample report templates?

PPE-168 Does the Communication Plan identify the vendor organization chart?

3.9 Release Management Plans Evaluation Criteria (RPE)

Table 10, Release Management Plans Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

RPE-01 Does the solution release management plan include a description of the solution environment that

the software will be released to?

RPE-02 Does the solution release management plan include a pilot deployment to a production like

environment?

RPE-03 Does the solution release management plan include a pilot deployment that consists of rolling the

software out to a selected group of users for testing?

RPE-04 Does the solution release management plan include a pilot deployment that includes training for

the pilot test group?


RPE-05 Does the solution release management plan include a pilot deployment that includes a rollback

plan and the guidelines for implementing a rollback?

RPE-06 Is the content of the solution release management plan consistent with the rest of the solution

documentation?

RPE-07 Does the solution release management plan define the release criteria that the solution must meet

in order to be considered ready for release?

RPE-08 Are all of the release criteria 1) Specific, 2) Measurable, 3) Attainable, 4) Relevant, and 5)

Trackable?

3.10 Requirements Management Plan Evaluation Criteria (RMP)

Table 11, Requirements Management Plan Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

RMP-01 Does the Requirements Management Plan identify the project requirements management process?

RMP-02 Does the Requirements Management Plan indicate that each requirement will be uniquely

identified?

RMP-03 Does the Requirements Management Plan indicate that each requirement will be a single thought

or statement?

RMP-04 Does the Requirements Management Plan identify a change process that ensures that, if a change

is made, it is done in the most efficient way possible, following the established procedures and

ensuring the quality and continuity of the system?

RMP-05 Does the Requirements Management Plan identify how changes will be identified?

RMP-06 Does the Requirements Management Plan identify how changes will be properly recorded,

classified and documented?

RMP-07 Does the Requirements Management Plan identify the requirements change review and approval

process?

RMP-08 Does the Requirements Management Plan identify how changes will be planned, implemented

and tested?

3.11 Requirements Evaluation Criteria (RE)

Table 12, Requirements Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

Organization and Completeness

RE-1 Is each requirement within the scope of the project?

RE-2 Are all internal cross-references to other requirements correct?

RE-3 Are all requirements written at a consistent and appropriate level of detail?

RE-4 Do the requirements provide an adequate basis for design?

RE-5 Is the implementation priority of each requirement included?

RE-6 Are all external hardware, software, and communication interfaces defined?

RE-7 Have algorithms intrinsic to the functional requirements been defined?

RE-8 Does the requirement document include all of the known customer or system needs?

RE-9 Is any necessary information missing from a requirement? If so, is it identified as TBD?

RE-10 Is the expected behavior documented for all anticipated error conditions?

Correctness

RE-11 Do any requirements conflict with or duplicate other requirements?

RE-12 Is each requirement written in clear, concise, unambiguous language?


RE-13 Is each requirement verifiable by testing, demonstration, review, or analysis?

RE-14 Is each requirement free from content and grammatical errors?

RE-15 Can all of the requirements be implemented within known constraints?

RE-16 Are any specified error messages unique and meaningful?

Quality Attributes

RE-17 Are all performance objectives properly specified?

RE-18 Are all security and safety considerations properly specified?

RE-19 Are other pertinent quality attribute goals explicitly documented and quantified, with the

acceptable tradeoffs specified?

Traceability

RE-20 Is each requirement uniquely and correctly identified?

RE-21 Are all internal cross-references to other requirements correct?

RE-22 Is each software functional requirement traceable to a higher-level requirement (e.g., system

requirement, use case)?

Special Issues

RE-23 Are all requirements actually requirements, not design or implementation solutions?

RE-24 Are the time-critical functions identified, and timing criteria specified for them?

RE-25 Have internationalization issues been adequately addressed?

RE-26 Is there only one requirement per paragraph? Unique requirements should not be stacked together.

RE-27 Do the requirements limit their specification to "what" is required, and avoid stating "how" to do

the job?

RE-28 Is the meaning of each requirement easily understood and easy to read?

RE-29 Are the requirements complete? Incomplete lists with "etc.", "and/or", and "TBD" must not be

used.

RE-30 Within the requirements, does every pronoun, ("it" or "its") have an explicit and unmistakable

reference?

RE-31 Is each requirement verifiable by testing, demonstration, review, or analysis?

RE-32 Does each requirement refrain from use of non-quantifiable measures such as flexible, modular,

efficient, adequate, and fast?

3.12 Requirements Traceability Matrix Evaluation Criteria (RTM)

Table 13, Requirements Traceability Matrix Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

RTM-1 Does the RTM include all source requirements?

RTM-2 Does the RTM include all derived requirements?

RTM-3 Do all requirements have their source indicated? (Backward Trace)

RTM-4 Are all requirements allocated to system solution(s)/components or other deliverables? (Forward

Trace)

RTM-5 Is the RTM updated appropriately after each requirements change?
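RTM-3 (backward trace) and RTM-4 (forward trace) become mechanical checks once the matrix is held in a structured form. The sketch below is hypothetical; the row layout, field names, and sample entries are illustrative, not the project's actual RTM format:

```python
# Hypothetical RTM rows: each requirement names its source (backward
# trace, RTM-3) and its allocation to a component (forward trace, RTM-4).
rtm = [
    {"id": "REQ-001", "source": "RFP 3.2", "allocated_to": "Intake Module"},
    {"id": "REQ-002", "source": "RFP 3.4", "allocated_to": "Reports"},
    {"id": "REQ-003", "source": None,      "allocated_to": "Billing"},
]

def missing_backward_trace(rows):
    # Requirements with no source violate RTM-3.
    return [r["id"] for r in rows if not r["source"]]

def missing_forward_trace(rows):
    # Requirements allocated to nothing violate RTM-4.
    return [r["id"] for r in rows if not r["allocated_to"]]

print(missing_backward_trace(rtm))  # → ['REQ-003']
print(missing_forward_trace(rtm))   # → []
```

Running such checks after every requirements change is one concrete way to evidence RTM-5.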

3.13 Site and System Security Plan Evaluation Criteria (SSSP)

Table 14, Site and System Security Plan Evaluation Criteria


Criteria ID Deliverable Evaluation Criteria

SSSP-1 Does the plan describe what security controls are in place at each level (physical, hardware, operating system, virtual machine, system software, application software), so that a risk assessment can be performed to assess the risk to the system and data?

SSSP-2 Is the document based on the FedRAMP System Security Plan (SSP) Template available at

https://www.fedramp.gov/resources/templates-3/ ?

SSSP-3 Does the plan indicate that the vendor regularly assesses and tests its security controls?

SSSP-4 For each control defined in the SSSP, does the plan provide the following information:

• Control ID

• Summary of Control

• Responsible Role

• Parameters for the control

• Implementation Status:

– Implemented

– Partially implemented

– Planned

– Alternative implementation

– Not applicable

• Control Origination:

– Service Provider Corporate

– Service Provider System Specific

– Service Provider Hybrid (Corporate and System Specific)

– Configured by Application vendor

– Provided by Application vendor
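The SSSP-4 fields can be captured as a simple per-control record. The status and origination values below come from the plan's own lists; the record layout, the validation approach, and the sample NIST-style control ID are illustrative only:

```python
# Allowed values taken from the SSSP-4 lists above (lower-cased here).
STATUSES = {"implemented", "partially implemented", "planned",
            "alternative implementation", "not applicable"}
ORIGINATIONS = {"service provider corporate", "service provider system specific",
                "service provider hybrid", "configured by application vendor",
                "provided by application vendor"}

def make_control(control_id, summary, role, parameters, status, origination):
    # Validate against the plan's enumerations before accepting the record.
    if status not in STATUSES:
        raise ValueError(f"invalid status: {status}")
    if origination not in ORIGINATIONS:
        raise ValueError(f"invalid origination: {origination}")
    return {"control_id": control_id, "summary": summary,
            "responsible_role": role, "parameters": parameters,
            "status": status, "origination": origination}

# Hypothetical example in the style of a FedRAMP SSP control entry.
ac2 = make_control("AC-2", "Account management", "System Administrator",
                   {"review_interval_days": 90}, "implemented",
                   "service provider system specific")
print(ac2["status"])  # → implemented
```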

3.14 System Validation Evaluation Criteria (SVE)

Table 15, System Validation Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

SVE-1 Have all integration tests been executed successfully?

SVE-2 Has the data migration completed successfully?

SVE-3 Have all system acceptance tests been executed successfully?

SVE-4 Have all discovered software problems been documented and resolved?

SVE-5 Have all software changes been appropriately identified?

SVE-6 Have all software changes been appropriately documented?

SVE-7 Have all software changes been appropriately tested?

SVE-8 Have all system acceptance test results been reviewed and accepted?

SVE-9 Have all encountered system errors been evaluated and accepted?

SVE-10 Have the Final Test Results been evaluated and accepted?

SVE-11 Have all data errors been resolved?

3.15 Test Auditing (TA)

Table 16, Test Auditing Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

TAE-1 Were all tests reviewed prior to execution?

TAE-2 Did all tests pass satisfactorily?

TAE-3 Was all test data adequate for the testing?


TAE-4 Were any encountered defects tracked accordingly?

TAE-5 Were any follow-up tests run successfully?

3.16 Test Plan Evaluation Criteria (TPE)

The Test Plan Evaluation Criteria will measure how well a test plan performs its functions. A test

plan is good to the extent that it satisfies these criteria. Exactly how good is “good enough”

depends on situational factors and judgments.

Table 17, Test Plan Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

TPE-1 Does the test plan effectively serve its intended functions?

TPE-2 Is the test plan accurate with respect to any statements of fact?

TPE-3 Does the test plan make efficient use of available resources?

TPE-4 Does the test plan tolerate reasonable change and unpredictability in the project?

TPE-5 Is the test plan self-consistent and sufficiently unambiguous?

TPE-6 Is the test plan concise, maintainable, and helpfully organized?

TPE-7 Does the test plan meet externally imposed requirements?

TPE-8 Does the test plan indicate and support an effective test planning process?

TPE-9 Does the test plan indicate a testing process that is within the capability of the organization that

must perform the planning, testing, and reporting?

TPE-10 Are the testing methods clearly described and appropriate?

TPE-11 Do the test methods verify the design?

TPE-12 Does the test plan indicate how performance testing will be performed?

TPE-13 Does the test plan indicate how data migration testing will be performed?

TPE-14 Does the test plan indicate how acceptance testing will be performed?

TPE-15 Does the test plan indicate how usability and accessibility testing will be performed?

TPE-16 For Change Orders, does the Test Plan set forth the kind, type, volume considerations, and number of tests to be performed for acceptance of the Change Order?

3.17 Test Strategy Evaluation Criteria (TSTR)

Table 18, Test Strategy Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

TSTR-01 Does the Test Strategy identify the requirements and tests that have been allocated for each release of the solution?

TSTR-02 Does the Test Strategy provide a cross reference of the System Requirements to Test Cases to Test Scripts for each solution release?

TSTR-03 Does the Test Strategy provide a guide for the design of Test Cases and Test Scripts by stating the general purpose of the testing? The statement of coverage strategy can be as simple as, "Verify all performance requirements" or "Execute every line of code."

TSTR-04 Does the Test Strategy provide guidance for failed test scripts and how they should be addressed (schedule additional testing, relax test criteria, schedule the test case for the next release, etc.)?
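The cross reference described in TSTR-02 can be pictured as a simple traceability structure. The sketch below is illustrative only; all identifiers (R1, TC-01, TS-01, release numbers) are hypothetical and not taken from the project:

```python
# Hypothetical sketch of the TSTR-02 cross-reference: System Requirements
# mapped to Test Cases, and Test Cases to Test Scripts, per solution release.
# All identifiers are illustrative, not taken from the project.

trace = {
    "R1": {"release": "1.0", "test_cases": {"TC-01": ["TS-01", "TS-02"]}},
    "R2": {"release": "1.0", "test_cases": {}},  # no coverage allocated yet
}

def uncovered(trace):
    """Return requirement IDs with no allocated test cases (a review flag)."""
    return [req for req, info in trace.items() if not info["test_cases"]]

print(uncovered(trace))  # → ['R2']
```

A reviewer checking TSTR-01 and TSTR-02 would expect the list of uncovered requirements to be empty for each release, or explained in the Test Strategy.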

3.18 Test Case Evaluation Criteria (TCSE)

Table 19, Test Case Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

TCSE-01 Do the test cases describe the test, including the test objective?

TCSE-02 Do the test cases describe software and hardware components evaluated by the test case?

TCSE-03 Do the test cases describe the test environment?

TCSE-04 Do the test cases identify the requirements being tested?

TCSE-05 Do the test cases define the appropriate user roles applicable to the test?

TCSE-06 Do the test cases describe general test conditions?

TCSE-07 Do the test cases describe required pre-conditions, including required previous successful tests?

TCSE-08 Do the test cases describe expected post conditions?

TCSE-09 Do the test cases include test cases for all expected system functions?

TCSE-10 Are the test methods clearly described and appropriate?

TCSE-11 Do the test methods verify the design?

3.19 Test Script Evaluation Criteria (TSE)

Table 20, Test Script Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

Acceptance Test Script Evaluation Criteria

TSE-01 Do the acceptance test scripts describe the test, including the test objective?

TSE-02 Do the acceptance test scripts describe the software and hardware components evaluated by the test case?

TSE-03 Do the acceptance test scripts describe the test environment?

TSE-04 Do the acceptance test scripts identify the requirements being tested?

TSE-05 Do the acceptance test scripts define the appropriate user roles applicable to the test?

TSE-06 Do the acceptance test scripts describe the general test conditions?

TSE-07 Do the acceptance test scripts define the required pre-conditions, including required previous successful tests?

TSE-08 Do the acceptance test scripts define the expected post conditions?

TSE-09 Do the acceptance test scripts include tests for all expected system functions?

TSE-10 Are the acceptance test scripts clearly described and appropriate?

TSE-11 Do the acceptance test scripts verify the design?

Performance Test Script Evaluation Criteria

TSE-21 Do the performance test scripts test the performance requirements as stated in the RFP?

TSE-22 Do the performance test scripts indicate that they will be executed in a production or production-like environment?

TSE-23 Do the performance test scripts include the test objective?

TSE-24 Do the performance test scripts describe software and hardware components evaluated by the test case?

TSE-25 Do the performance test scripts describe the test environment?

TSE-26 Do the performance test scripts define the appropriate user roles applicable to the test?

TSE-27 Do the performance test scripts describe general test conditions?

TSE-28 Do the performance test scripts describe required pre-conditions, including required previous successful tests?

TSE-29 Do the performance test scripts describe expected post conditions?

TSE-30 Are the performance test scripts clearly described and appropriate?


3.20 Test Results Evaluation Criteria (TRE)

Table 21, Test Results Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

TRE-1 Do the Testing Results clearly describe the results?

TRE-2 Do the Testing Results clearly describe how to interpret the results?

TRE-3 Do the Testing Results clearly describe how to determine the perceived validity of the results?

TRE-4 Do the Testing Results clearly state the results and conclusions? The results should be quantitative where possible and include applicable statistical analyses (mean, standard deviation, etc.).
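The summary statistics TRE-4 calls for can be computed with the standard library. The measurements below are hypothetical sample data, not project results:

```python
# Illustrative computation of the summary statistics TRE-4 calls for
# (mean and sample standard deviation) over hypothetical response times.
import statistics

response_times_ms = [210, 195, 230, 205, 220]  # sample data, not project results

mean = statistics.mean(response_times_ms)
stdev = statistics.stdev(response_times_ms)  # sample standard deviation
print(f"mean={mean:.1f} ms, stdev={stdev:.1f} ms")  # → mean=212.0 ms, stdev=13.5 ms
```

Reporting both the mean and the spread lets a reviewer judge whether a performance requirement is met consistently rather than only on average.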

3.21 Training Plan Evaluation Criteria (TRPE)

Table 22, Training Plan Evaluation Criteria

Criteria ID Deliverable Evaluation Criteria

TRPE-1 How a train-the-trainer approach will be incorporated into the training plan.

TRPE-2 The types of additional training being proposed by job function, including the method in which training is offered (e.g., computer or web-based training, face-to-face classroom-style training with an instructor, one-to-one training, or self-study). Bidders should be mindful that employees within the OPWDD system have a range of technological skills, from limited knowledge and experience with computers to proficiency in the use of technology. Training will need to meet these varying abilities;

TRPE-3 The duration of each class;

TRPE-4 The number of classes that will be offered and/or the number of participants included in each

training;

TRPE-5 How often the training will be offered (as needed, or on a set calendar schedule);

TRPE-6 The recommended number of people that should attend training;

TRPE-7 Identification of who will provide the proposed product training to each discipline;

TRPE-8 A description of ongoing training programs and how such training will be accessed;

TRPE-9 The extent that the Bidder provides a “Help Desk” or technical support to end users;

TRPE-10 A description of how training materials will be made available and accessible.

TRPE-11 Describe the extent to which on-line training modules are provided.

TRPE-12 Describe the use of the “train the trainer” approach.

TRPE-13 The types of training being proposed by job function.

TRPE-14 The manner in which the “train the trainer” training, as well as any other training, will be presented (i.e., classroom style with an instructor, one-on-one, computer-based training, self-study, etc.).

TRPE-15 Describe how often training sessions will be offered.

TRPE-16 Describe ownership of training materials and the extent that training materials are available on an ongoing basis.

TRPE-17 Describe the form and format of training materials.

TRPE-18 The duration of each “train the trainer” class.

TRPE-19 The location of the training (Note: OPWDD requires that training be offered at a minimum of six locations. See Section titled Training Plan located in Section 3 of this RFP).

TRPE-20 How often training is offered (as needed, or on a set calendar schedule).

TRPE-21 The number of classes or training courses that will be offered.

TRPE-22 The recommended number of people that should attend training.

TRPE-23 Who provides the proposed product training and expectations of OPWDD employees.

TRPE-24 A description of ongoing training programs.

TRPE-25 The extent that the Bidder’s Help Desk support is available to end users.

TRPE-26 Extent to which training materials address OPWDD customizations.

TRPE-27 Proposal for Optional Training Deliverable to provide training to all end users without leveraging state staff. This is an optional deliverable. Bidders should describe any training options they can offer in addition to the train-the-trainer approach. The cost evaluation (Attachment 5) should provide a cost for training using the train-the-trainer model and a separate cost for any additional training modalities that are available.

TRPE-28 Does the Training Plan identify the project training process?

TRPE-29 Does the Training Plan identify the project training roles and responsibilities?

TRPE-30 Does the Training Plan identify the project training requirements, pre-requisites, assumptions, and constraints?

TRPE-31 Does the Training Plan identify the project training schedule?

TRPE-32 Does the Training Plan identify the project training agenda?

TRPE-33 Does the Training Plan include a role-based training plan that identifies how the training will be delivered for each identified role?

TRPE-34 Does the Training Plan identify the required training resources, any required preparation for performing training, and any training materials that will be used (handouts, workbooks, etc.)?

TRPE-35 Does the Training Plan identify the process for evaluating the training and the process that will be followed to improve the training?


Attachment 1:

Deliverable Document Review Report Template

Deliverable Document Review Report

Deliverable Identifier and Name(s):

TBD

Deliverable Document Version: TBD

Actual Date of Delivery: TBD

for the

OPWDD Electronic Health Record Project

Date of Review

Summary of Findings

Recommended Action

File(s) Reviewed

Revision History

Date Version Author(s) Description


APPENDIX 1 EXECUTIVE SUMMARY

This document presents the findings associated with the review of the EHR Deliverable defined on the cover page.

Acronyms and terminology used within this document are consistent with the acronyms and terminology used on the EHR project. Definitions of acronyms and terms can be found in project glossaries and acronym lists.

APPENDIX 2 DETAILED FINDINGS

The findings of all reviewers are listed in the following Table 1, Detailed Findings.

The column titled “Class” contains a value indicating the severity level of the finding. The classification values have the following meanings:

Class 1 – Minor Severity Error (e.g., spelling, grammar, formatting).

Class 2 – Modest Severity Error (e.g., inconsistent terminology, vague or confusing statements).

Class 3 – Significant Discrepancy (e.g., missing content, functionality not as expected, processes not as expected) – may be raised to program management level for resolution.

Class 4 – Major Discrepancy (e.g., scope dispute, missing functional requirements, major deviation from expectations) – raise to senior director or sponsor level for resolution.
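The classification scheme above can be sketched as a simple lookup table. The escalation paths for Classes 3 and 4 follow the text; treating Classes 1 and 2 as resolved within the review team is an assumption for illustration:

```python
# Sketch of the finding severity classes as a lookup table.
# Classes 3 and 4 escalation follows the text above; handling of
# Classes 1 and 2 within the review team is an assumption.
ESCALATION = {
    1: "review team",                 # minor: spelling, grammar, formatting
    2: "review team",                 # modest: inconsistent terminology, vagueness
    3: "program management",          # significant discrepancy
    4: "senior director or sponsor",  # major discrepancy
}

def escalation_level(finding_class: int) -> str:
    """Return where a finding of the given class is resolved."""
    if finding_class not in ESCALATION:
        raise ValueError(f"unknown finding class: {finding_class}")
    return ESCALATION[finding_class]
```

Encoding the scheme this way keeps the severity-to-escalation mapping in one place if the review process is ever automated.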

The findings have two means of location, the deliverable document “Section Number” and “Page Number”.

The “Source” column indicates the source of the finding.

Each finding has a unique Finding number, assigned consecutively as the findings were compiled; no order of priority is implied by this number.


Table 1, Detailed Findings

Finding # Class Section Page Source Status Description

TBD

APPENDIX 3 SATISFIED FINDINGS

Previous findings that have been satisfied are listed in the following Table 2, Satisfied Findings:


Table 2, Satisfied Findings

Finding # Class Section Page Source Status Description

TBD