Bed Management Solution (BMS) October 2013
System Performance Report Version 1.0
Bed Management Solution (BMS)
System Performance Report
October 2013
Prepared by Harris Corporation
CLIN 0003AD
Revision History
Creation Date   Version No.   Description/Comments   Author(s)   Reviewer(s)   Review Type   Issue Date
09/24/2013      1.0           Initial baseline.      L. Woods                                10/07/2013
This document contains information and/or data for use in support of EVEAH Bed Management Solution (BMS). Content
generated and derived for this document is intended, and applicable, for both applications and project efforts.
Table of Contents
1 General Information
  1.1 Purpose
  1.2 Scope
  1.3 Roles and Responsibilities
  1.4 Simulated Production System Overview
  1.5 Acronyms and Glossary
2 System Performance Measuring
  2.1 Benchmarks
  2.2 System Monitoring Tools
  2.3 Traffic Models
3 System Performance Reporting
  3.1 Performance Data Collecting
  3.2 Performance Data Analysis
  3.3 System Performance Report Form
    3.3.1 Availability
    3.3.2 Response time
    3.3.3 Simultaneous user handling
    3.3.4 VAMC sites supported by the BMS application
    3.3.5 Business Transaction Time Distribution Graph
    3.3.6 Business Transaction Defect Pareto Graph
4 Related Documentation
Bed Management Solution (BMS) October 2013
System Performance Report iv Version 1.0
List of Tables

Table 1 – Roles and Responsibilities
Table 2 – Acronyms and Glossary

List of Figures

Figure 1 - System Overview
Figure 2 - Maximum of 152 User Sessions on BMS
Figure 3 - BMS Web Front Ends
Figure 4 - BMS Web Front Ends
Figure 5 - BMS Web Database
Figure 6 - BMS Web Client Web Services
Figure 7 - MDWS Server Web Services
Figure 8 - BMS ServiceHost Server Web Services
Figure 9 - BMS ServiceHost Web Services
Figure 10 – BMS ServiceHost Server Web Service
Figure 11 - Business Transaction Time Distribution Graph
Figure 12 - Business Transaction Time Measurements Table
Figure 13 - Business Transaction Time Measurements and Percentiles Table
Figure 14 - Business Transaction Time Percentiles Table (Cont.)
Figure 15 - Business Transaction Defect Pareto Graph
Figure 16 - Business Transaction Defect Measurements Table
1 General Information

Modernizing and enhancing the Bed Management Solution (BMS) system aligns the Department of
Veterans Affairs (VA) with its Initiative 8 – Enhance the Veteran Experience and Access to Healthcare
(EVEAH) program by leveraging technology to enhance staff awareness of patient care status and
manage patient flow.

An advanced real-time system is needed to support the management of beds in VA Medical Centers
(VAMCs). Improving bed management has been identified as a critical enabler of patient flow. Efficient
bed management ensures maximum utilization of existing bed capacity, increases patient throughput by
decreasing waiting times, and allows for a smooth transition of patients from the Emergency Department
and Surgery to inpatient beds. It facilitates efficient patient flow operations and provides reports on the
performance of bed management activities. This intelligence enables VA facilities and Veterans
Integrated Service Networks (VISNs) to track Key Performance Indicators and meet the Deputy Under
Secretary for Health (DUSH) guidelines.
1.1 Purpose

The purpose of the System Performance Report is to establish system performance capabilities for the
BMS environment and to list the performance monitoring tools that can be used to gather those
capabilities. These performance capabilities may include, but are not limited to: internal system
response time per individual request; overall simultaneous request handling; connection quantity
handling; expected system utilization maximums; disk storage constraints; VistA server response time
and present bandwidth limitations; and hardware requirements at local workstations that access the
system, which may include Central Processing Unit (CPU) utilization, memory utilization, network
throughput, and disk utilization. This document, combined with the System Performance and Capacity
Metrics document, defines the collection and publishing procedures related to system capacity and
health. Initial system performance measurements, or benchmarks, are gathered in the simulated
production (test) environment and during the initial stages of the actual production environment. These
benchmark readings are used to make adjustments to the system to improve system performance. After
additional features or system changes are implemented within the BMS application, system performance
measurements are collected and reported, as they become available and are defined, to develop new
baseline measurements.
1.2 Scope

The scope of the report is to gather the system performance capabilities and to discover what hardware,
network, and server demands are expected, to help define minimum system requirements. All data
discovered is quantified and reported, and is then used to form benchmark readings, data collection sets,
and traffic models. The data collection sets and traffic models are formulated into a system performance
report. The performance capabilities, or benchmarks, establish performance measurements of the system
that are used to evaluate the system after a change is implemented to the system or the application, so that
adjustments may be made to improve the operation of the system and the application.
1.3 Roles and Responsibilities
The following table shows the active roles, responsibilities, and assigned tasks on the BMS project.
Table 1 – Roles and Responsibilities
Role Responsibilities
Analyst Requirements analysis, high level design and documentation. Technical analysis and documentation for other activities.
Configuration Manager Responsible for controlling and managing all artifacts produced by the project teams.
Database Administrator Responsible for database installation and changes as a result of the development including database upgrades, migrations, and data patches (scripts that correct data problems in a given environment).
Developers Responsible for software development and unit testing of the technical solution and supporting technical documentation.
Functional Analyst Analysis of clinical workflow, terminology, and algorithm verification and creation.
Project Manager Executive oversight of the BMS program; advisor to the Task Order Manager; and senior client relations activities.
Process and Product Quality Assurance
Conducts product and process quality assessment activities.
Program Manager Executive oversight of the BMS program regarding contractual or financial concerns; advisor to the Task Order Manager; and senior client relations activities.
Project Planner Creates and maintains project plans in MS Project 2007.
Release Manager Reviews all patch artifacts/interfaces and has final approval in the Release Process.
Requirements Manager Oversight for requirements gathering and processing.
Scrum Master Leads the daily scrums and acts as servant leader to the scrum teams.
Software Quality Assurance
Conducts software quality assessment activities.
System Administrator Responsible for the operating system administration of the server environments.
Technical Architect Responsible for the technical solutions implemented in the patches and ensuring that all patches take into account the other work being done on BMS and any other products that interface with BMS. Provide direction and continuity for the technical solution.
Technical Writer Develops, reviews, edits, and updates the documentation needed by projects and tasks.
Test Engineer Responsible for verifying that the documented functionality works as intended and that the results are documented. Testing includes functional system tests, 508 tests, performance tests, and more.
1.4 Simulated Production System Overview
The Simulated Production System Overview is based on the BMS System Design Document (SDD). The
SDD is based on information supplied by VA regarding the architecture for the production environment
within the Austin Information Technology Center (AITC).
Prior to any rollout of upgrades to a VA production environment, there is a testing and acceptance process
completed on an established simulated production system. This simulated production system facilitates
performance and capacity testing by serving as a functional model of the environment where upgrades are
subsequently deployed.
The simulated production system shall include PC workstations typical of those used to access the BMS
system; local servers and firewalls similar to those found at the facility; as well as web servers,
application servers, and SQL servers operating under access demands simulating what is encountered
when accessing Veterans Health Information Systems and Technology Architecture (VistA).
The BMS system is split across several logical infrastructure levels, as shown in Figure 1. The following
diagram depicts the Pre-Production and Production environments. The Development, Software Quality
Assurance (SQA), and Live Quality Assurance (QA) environments will be identical.
Figure 1 - System Overview
1.5 Acronyms and Glossary
Table 2 – Acronyms and Glossary
Term Definition
AITC Austin Information Technology Center
APM Application Performance Management/Monitoring
BMS Bed Management Solution
CPU Central Processing Unit
DUSH Deputy Under Secretary for Health
EVEAH Enhance the Veteran Experience and Access to Healthcare
IT Information Technology
QA Quality Assurance
SDD System Design Document
SQA Software Quality Assurance
VA Department of Veterans Affairs
VAMC Veterans Affairs Medical Center
VISN Veterans Integrated Service Network
VistA Veterans Health Information Systems and Technology Architecture
2 System Performance Measuring
2.1 Benchmarks
Benchmarks, or performance markers, can consist of, but are not limited to, transactions per second,
system CPU/disk/memory usage, and network throughput. The benchmarks, or performance tests, of the
BMS system are developed during the testing/pre-production phase of the BMS project. Benchmark
readings are redeveloped post-production and following the introduction of additional system or program
features or changes, so that the performance markers remain correct for the current system.
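As an illustration of how such benchmark markers can be derived from raw load-test timings, the following Python sketch computes transactions per second and response-time figures. The function name and sample values are notional, not BMS data.

```python
# Illustrative only: derive benchmark markers from raw request timings.
from statistics import mean

def benchmark(durations_ms, window_s):
    """Summarize one load-test window into benchmark markers."""
    return {
        "transactions_per_second": len(durations_ms) / window_s,
        "avg_response_ms": mean(durations_ms),
        "max_response_ms": max(durations_ms),
    }

# Four sampled requests observed over a 10-second window.
sample = benchmark([100.0, 200.0, 300.0, 400.0], window_s=10)
```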
2.2 System Monitoring Tools

System monitoring tools for gathering the information in this document are installed, administered, and
maintained at, and by, the AITC.

Data Collection Sets
Data collection sets organize multiple data collection points into a single component that is used to review
or log system performance. These data collection sets can be configured to generate alerts when a
predefined threshold is reached, such as a memory utilization or network throughput limit. System data
collection consists of items such as:

Disk Usage – tracks the growth of database and log files and provides file-related statistics.
Server Activity – provides an overview of server activity, resource utilization, and contention.
Query Statistics – gathers data about query statistics and individual query text and plans.
There is a dependency on the architecture that is put into place at, and by, the AITC, which further
determines how these and other data collection sets may be configured and collected.
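A data collection set of this kind can be modeled as a group of counter samplers with per-counter alert thresholds. The sketch below is a notional illustration; the counter names and threshold values are not taken from the AITC configuration.

```python
# Illustrative only: a data collection set as named counter samplers
# plus per-counter alert thresholds.
def collect(samplers, thresholds):
    """Take one sample from every counter; return the snapshot and any alerts."""
    snapshot = {name: fn() for name, fn in samplers.items()}
    alerts = [name for name, value in snapshot.items()
              if value >= thresholds.get(name, float("inf"))]
    return snapshot, alerts

samplers = {"memory_pct": lambda: 92.0, "net_mbps": lambda: 340.0}
thresholds = {"memory_pct": 90.0}  # alert when memory utilization reaches 90%
snapshot, alerts = collect(samplers, thresholds)
```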
2.3 Traffic Models
Traffic models can be implemented to characterize the network load and produce predictions of system
performance under given factors. There is a dependency on the architecture that is put into place at, and
by, the AITC, which determines the limits of the network load on the system. The installation of the
equipment that supports this architecture is administered and maintained by VA.
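The report does not prescribe a particular traffic model; as one simple example of the kind of prediction a traffic model produces, an M/M/1 open-queueing model relates offered load and service rate to mean response time:

```python
# Illustrative only: M/M/1 mean response time R = 1 / (mu - lambda).
def mm1_response_time(arrival_rate, service_rate):
    """Predicted mean response time for a single-server open queue."""
    if arrival_rate >= service_rate:
        raise ValueError("offered load saturates the server")
    return 1.0 / (service_rate - arrival_rate)

# 50 requests/s against a server completing 60 requests/s.
r = mm1_response_time(50.0, 60.0)  # 0.1 s mean response time
```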
3 System Performance Reporting
3.1 Performance Data Collecting
There is a dependency on the architecture that is put into place at the AITC, which further determines
how the performance data collection process occurs.
3.2 Performance Data Analysis

There is a dependency on the architecture that is put into place at the AITC, which further determines
how the data analysis process occurs. Data has been collected and analyzed; it should provide real
performance data on BMS Class 1 from the field. It is understood that the priority of the data is to support
the contractual operational requirements, specifically those listed below:

o 90% availability
o 2 second response time (as measured within the system itself)
o 7,700+ simultaneous user handling (during AM peak times), with equivalent simultaneous
  read/write/etc. transaction support
o 153 VAMC sites supported by the BMS application
o Capable of handling 616,000+ transactions per day
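A measurement period can be checked mechanically against these five contractual figures. The following sketch is illustrative; the input field names are notional, not taken from any BMS data feed.

```python
# Illustrative only: check measured figures against the contractual
# operational requirements listed above.
REQUIREMENTS = {
    "availability_pct": 90.0,
    "response_time_s": 2.0,
    "peak_simultaneous_users": 7700,
    "vamc_sites": 153,
    "transactions_per_day": 616000,
}

def meets_requirements(measured):
    """Return the list of requirement names the measurements fail."""
    failures = []
    if measured["availability_pct"] < REQUIREMENTS["availability_pct"]:
        failures.append("availability")
    if measured["response_time_s"] > REQUIREMENTS["response_time_s"]:
        failures.append("response time")
    if measured["peak_simultaneous_users"] < REQUIREMENTS["peak_simultaneous_users"]:
        failures.append("simultaneous users")
    if measured["vamc_sites"] < REQUIREMENTS["vamc_sites"]:
        failures.append("sites")
    if measured["transactions_per_day"] < REQUIREMENTS["transactions_per_day"]:
        failures.append("daily transactions")
    return failures
```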
3.3 System Performance Report Form
This section includes formal performance report forms and/or documentation with related test results.
There are currently 31 VAMCs cut over to BMS Class I.
Figure 2 - Maximum of 152 User Sessions on BMS
Below are graphs illustrating BMS Class I system performance:
NOTE: The system is capable of handling 616,000+ transactions per day.
For the purposes of this document, a defect, with regard to the end user experience monitoring tool,
means any of the following:

o The transaction was slow, i.e., it broke the threshold that has been set.
o The transaction resulted in a client request error.
o The transaction resulted in a server response error.
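The defect rule above reduces to a three-part predicate. In the sketch below, the 2,000 ms threshold is an assumption aligned with the 2-second response-time requirement, not a value read from the monitoring tool's configuration.

```python
# Illustrative only: classify one transaction per the defect definition above.
SLOW_THRESHOLD_MS = 2000  # assumed; aligned with the 2-second requirement

def is_defect(elapsed_ms, status_code):
    """A defect is a slow transaction, a client error, or a server error."""
    slow = elapsed_ms > SLOW_THRESHOLD_MS
    client_error = 400 <= status_code < 500
    server_error = status_code >= 500
    return slow or client_error or server_error
```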
3.3.1 Availability

The availability dashboard shows synthetic testing results for the BMS system, including calls to the
frontend web application along with tests on the availability of the backend services (WSDLs).

Figure 3 presents the three core graphs indicating the availability of BMS from 8/25 to 9/24.
Figure 3 - BMS Web Front Ends
Availability – Reports either a 1 or a 0: 1 = a successful call was made; 0 = the synthetic test failed.
Response Code – Represents the HTTP response code returned from the synthetic test. A “200” is
expected as an HTTP OK. Anything >= 500 is a server error returned from the synthetic test.
Response Time – Application response time shown in milliseconds.
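The three dashboard values can be rolled up from raw synthetic-test samples as follows. The input shape (availability bit, HTTP code, response time) is a notional simplification of the tool's output.

```python
# Illustrative only: summarize synthetic-test results into the three
# dashboard values described above.
def summarize(results):
    """results: list of (availability_bit, http_code, response_ms) tuples."""
    up = sum(bit for bit, _, _ in results)
    server_errors = sum(1 for _, code, _ in results if code >= 500)
    avg_ms = sum(ms for _, _, ms in results) / len(results)
    return {"availability_pct": 100.0 * up / len(results),
            "server_errors": server_errors,
            "avg_response_ms": avg_ms}
```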
3.3.2 Response time
The dashboards in the following figures show the five core metrics returned for application components
from Application Performance Management/Monitoring (APM): Average Response Time, Responses Per
Interval, Concurrent Invocations, Errors Per Interval, and Stall Count.
Average Response time – Response time averages for monitored components shown in
milliseconds.
Responses Per Interval – Application response load for an application component. Shows the
number of times components are completed in an interval.
Concurrent Invocations – Shows the number of concurrent components that are “in flight”
during the interval.
Errors Per Interval – Shows response time errors per interval including application exceptions
and components that take longer than 30 seconds to respond (stall).
Stall Count – The number of components, during a reporting interval, that take longer than 30
seconds to respond.
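For one reporting interval, the five metrics can be derived from per-invocation start/end times and error flags. The sketch below is illustrative and follows the 30-second stall rule stated above; the input shape is an assumption.

```python
# Illustrative only: derive the five APM metrics for one reporting interval.
STALL_MS = 30000  # per the text, over 30 s to respond counts as a stall

def interval_metrics(invocations):
    """invocations: list of (start_ms, end_ms, is_error) tuples."""
    durations = [end - start for start, end, _ in invocations]
    # Sweep start/end events to find the peak number of in-flight components.
    events = sorted([(s, 1) for s, _, _ in invocations] +
                    [(e, -1) for _, e, _ in invocations])
    peak = current = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    stalls = sum(1 for d in durations if d > STALL_MS)
    errors = sum(1 for _, _, err in invocations if err)
    return {
        "avg_response_ms": sum(durations) / len(durations),
        "responses_per_interval": len(invocations),
        "concurrent_invocations": peak,
        "errors_per_interval": errors + stalls,  # exceptions plus stalls
        "stall_count": stalls,
    }
```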
Figure 4 presents the five core graphs for the BMS Web Front Ends and is an indicator of Response Time
performance.
Figure 4 - BMS Web Front Ends
There are currently 31 VAMCs cut over.
Figure 5 presents the five core graphs for the BMS Web Database time and is an indicator of Response
Time performance.
Figure 5 - BMS Web Database
Figure 6 presents the five core graphs for the BMS Web Client Web Services and is an indicator of
Response Time performance.
Figure 6 - BMS Web Client Web Services
3.3.3 Simultaneous user handling
Figure 7 presents the five core graphs for MDWS Frontends and is an indicator of user handling time
performance.
Figure 7 - MDWS Server Web Services
Figure 8 presents the five core graphs for BMS Service Host Front Ends and is an indicator of user
handling time performance.
Figure 8 - BMS ServiceHost Server Web Services
3.3.4 VAMC sites supported by the BMS application
Figure 9 presents the five core graphs for BMS ServiceHost Web Services and is an indicator of the
VAMC sites supported by the BMS application.
Figure 9 - BMS ServiceHost Web Services
Figure 10 presents the five core graphs for the WinServiceHost Front Ends and is an indicator of the
VAMC sites supported by the BMS application.
Figure 10 – BMS ServiceHost Server Web Service
3.3.5 Business Transaction Time Distribution Graph
The information presented in the following figures details the BMS transaction time distribution during
the month of September 2013. The transactions are categorized by Business Service type. This graph
displays Median, Average, Specification, and Range for each Business Service.
Figure 11 - Business Transaction Time Distribution Graph
Figure 12 includes information on Total Transactions, Maximum, Minimum and Data Points for each
Business Service.
Figure 12 - Business Transaction Time Measurements Table
Figure 13 includes information on Total Transactions, Maximum, Minimum, Data Points, Percentiles,
Data Span, and Averages for each Business Service.
Figure 13 - Business Transaction Time Measurements and Percentiles Table
Figure 14 includes Percentiles, Data Span and Averages information for each Business Service.
(Data unavailable for this period.)
Figure 14 - Business Transaction Time Percentiles Table (Cont.)
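The per-service statistics behind the figures in this section (median, average, percentiles, range) can be computed as below. The nearest-rank percentile definition and the sample times are assumptions, since the report does not state which definition the measurement tool uses.

```python
# Illustrative only: per-service transaction-time statistics.
from statistics import mean, median

def percentile(sorted_times, p):
    """Nearest-rank percentile: smallest value with at least p% of data at or below it."""
    k = max(0, -(-len(sorted_times) * p // 100) - 1)  # ceil(n*p/100) - 1
    return sorted_times[k]

times = sorted([120, 250, 90, 400, 310])  # notional transaction times (ms)
stats = {"median": median(times), "average": mean(times),
         "p90": percentile(times, 90), "range": (times[0], times[-1])}
```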
3.3.6 Business Transaction Defect Pareto Graph

The information presented in the following figures details the defects experienced by the BMS
program during the month of July 2013. Defects are categorized based on Business Transaction type.
This graph displays the distribution of detected defects.

(Data unavailable for this period.)
Figure 15 - Business Transaction Defect Pareto Graph
(Data unavailable for this period.)
Figure 16 - Business Transaction Defect Measurements Table
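Although the defect data for this period is unavailable, the reduction behind a Pareto graph is straightforward: sort defect counts per Business Transaction type in descending order and accumulate each type's share of the total. The transaction names below are notional, not actual BMS measurements.

```python
# Illustrative only: defect counts per transaction type, ordered for a
# Pareto graph with cumulative percentage of all defects.
def pareto(defect_counts):
    """Return (type, count, cumulative_pct) rows sorted by descending count."""
    total = sum(defect_counts.values())
    running = 0
    rows = []
    for txn, count in sorted(defect_counts.items(), key=lambda kv: -kv[1]):
        running += count
        rows.append((txn, count, round(100.0 * running / total, 1)))
    return rows

rows = pareto({"BedSearch": 60, "Admit": 25, "Transfer": 15})
```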
4 Related Documentation

Related or relevant documentation as applicable during the execution of the project:
CLIN 0002AV; System Performance and Capacity Metrics Report; Init8_BMS_PCMetrics
CLIN 0002AH; Hardware Specifications; Init8_BMS_HWSpec
CLIN 0002AN; System Test Plan; Init8_BMS_MTestPlan
CLIN 0002AP; Test Defect Report; Init8_BMS_DefectLog
Baseline data from the current implementation of BMS at the AITC, as provided by VA