Transcript of DTE&A Developmental Test & Evaluation Assessments (DTA)
Office of the Director for Developmental Test, Evaluation, and Assessments (DTE&A)
Mr. Woody Eischens
DTE&A Action Officer (support contractor)
Distribution Statement A. Approved for public release. Distribution is unlimited.
DT&E Assessment (DTA)
DTA Report
~10-15 Pages
DTA or DTSA Memorandum
~ 1-2 Pages
Capabilities
Purpose – Scope – Content
DT&E Assessments (DTA)
▪ DTA is a concise assessment of system capabilities, DT&E planning, and resources
▪ Written by OUSD(R&E)/DTE&A at key decision or knowledge points in a major
development or acquisition program
▪ DTA is an independent DT&E assessment tailored for senior DoD acquisition decision-makers to convey the technical state of the system under test at a decision point, and the ability of T&E to inform future decisions
[Graphic: Evaluation – Test / M&S – Data – Information – Resources – Schedule – Decisions]
DT&E Assessment Scope & Content
DTA Template Shell
▪ Cover Sheet - Executive Summary
▪ System Description
▪ Background
▪ Assessment
– T&E Planning and Test Resources
– Capabilities
– Integration
– Survivability
– Sustainment
▪ Conclusions and Recommendations
▪ References - Acronyms
System Description & Background
▪ A short paragraph that provides a brief description of the system, identifies the ACAT
level, and summarizes any planning or testing programmatic issues you believe are
important.
▪ Background Example
– The ITEP develops, tests, and qualifies the next-generation turboshaft engines for the (System(s)) aviation
fleets. The ITEP is an Acquisition Category IC program proceeding to MS A. The Joint Requirements Oversight
Council approved the Initial Capabilities Document in April 2012. The program has prepared a draft Capability
Development Document to support the MS A decision. The ITEP program plans to execute a tailored
acquisition program that reduces Government risk by placing high value on competition and creative
contracting strategies. A final contractor down-select decision is the primary output of the Technology Maturation
and Risk Reduction phase. The Army completed the Analysis of Alternatives in April 2014 and an Army
Independent Review Team completed the technology readiness assessment in September 2014.
T&E Planning & Resources
▪ Paragraph summarizes all T&E planning and test resource concerns with
recommendations to resolve the issues
▪ T&E Planning & Resources Considerations
– TEMP progress in Service or OSD staffing and any significant issues
– Program engagement with the AFIT Scientific Test and Analysis Techniques Center of
Excellence (STAT COE) or SMEs to develop a rigorous and efficient test plan linked with
Evaluation Framework
– Schedule realism for planned DT&E, including sufficient time for post-test analysis,
corrective action, and retest
– Common issues include – funding; lack of test articles; test range availability; shortfalls in
test and/or evaluation capability; lack of threat-representative simulations, targets, or
expendables; use or non-use of MRTFB assets and extreme natural environment test
centers
DTA T&E Planning and Resources Examples
Test Planning and Test Resources. The program entered MS B with a
Service-approved TEMP. My staff worked with the program to update the TEMP after
source selection to expand on the Developmental Evaluation Framework,
cybersecurity test strategy, and reliability growth planning. I approved the updated MS
B TEMP in March 2015. An updated TEMP to support the MS C decision is in USAF
Service staffing. I reviewed the MS C TEMP and assessed it as adequate to evaluate
system functionality and all critical technical parameters prior to (xxxx) Initial
Operational Test.
Test Planning and Test Resources. My office approved a Cybersecurity Survivability
Annex to the (xxxx) TEMP on January 31, 2019. As of June 2, 2020, the program
completed 86 percent of its DT&E test points. Only the MT-2 configuration and
MCS 3.1 Operational Build software regression points remain to be completed before
IOT&E.
System Capabilities
▪ Paragraph summarizes the inherent, system-specific, technical performance
capabilities
▪ For programs that have not started testing, analysis is focused on technology or
subsystem risk reduction progress and issues
▪ Where adequate data are available, results are compared to KPP and KSA
requirements. Articulate what worked well, any significant performance issues and
the mission capability impacts
▪ Capability Considerations
– Specify if the performance requirements are not evaluatable. Assess partially met
requirements as “Not Met” and caveat the results (e.g., “...the program met the minimum range
requirement but has not demonstrated the maximum range requirement.”)
Integration
▪ Summarizes the system’s ability to integrate or interoperate with people, external
networks, platforms, or Systems-of-Systems
– Provide a general characterization of integration by identifying the major platforms the system
will directly interface with and the key interfaces.
– Identify which KPP and KSA integration or interoperability requirements were Met or Not Met.
Articulate what worked well, any significant issues and the mission capability impacts.
– Discuss any issues or concerns with system authorizations to Test, Connect, or Certify.
▪ Integration Considerations
– Describe the state of integration for the system under test – what has been demonstrated for
networks, platforms, and/or other systems, and what are the significant issues identified.
– Describe the plans for an Interim Authority to Test (IATT) and/or Authorization to Connect (ATC)
– Include any Human-System Integration (HSI) issues or demonstrated capabilities
Integration Examples
Interoperability. The xxxx system demonstrated all five levels of interoperability as defined in
NATO STANAG 4586. The xxxx received an ATO from xxxx Cyber Command that expires in March
2020. The xxxx has limited interoperability with the xxxx. The onboard Mission Control Station
(MCS) that operates the xxxx can accept or transfer navigational data with the xxxx platform, but
cannot transfer xxxx EO/IR Full-Motion Video or metadata to the xxxx Communications Center. The
xxxx plans to update xxxx software later in 2017. The xxxx does not meet the EO/IR data KSA
requirement. These deficiencies will limit user ability to exploit EO/IR products after a mission.
The program office plans to correct significant xxxx motion and still imagery issues in software
increment xxxx, and test the fixes later in 2017.
Interoperability. I assess that the xxxx carries high risk toward meeting its communications KPP
and KSAs. The xxxx issues are due to configuration and connectivity issues in external National
Command and Control Communications networks. The problem originates on the enterprise side,
and the xxxx has submitted Mission Issue Papers to the appropriate agencies. The program has
identified workarounds to facilitate IOT&E.
Sustainment
▪ Summarizes the evaluation of system sustainability as specified in the TEMP and
DEF. Include Reliability, Availability, Maintainability (RAM) or other sustainment-related
requirement measures identified in draft or approved JCIDS documents
▪ Sustainment Considerations
– Provide a characterization of program reliability problems by providing a general
distribution of failures observed
– Discuss the sustainment impacts of demonstrated results to date, or of any issues
– Discuss any other sustainability-related requirements (e.g., logistics, training) and
demonstrated results to date
Sustainment Examples
Reliability, Availability, and Maintainability. The demonstrated reliability and availability of the
xxxx system met the operational availability KPP and KSA requirements with the exception of the
MCS and xxxx unscheduled maintenance rate. There is a safety issue associated with a xxxx
Airborne Subsystem single-point of failure risk. If the xxxx subsystem fails, the xxxx must divert to
a shore-based landing area or ditch. The xxxx must accept or mitigate this risk, or prohibit xxxx
operations.
Reliability, Availability, and Maintainability. One KPP and two KSAs characterize xxxx RAM
performance. The xxxx program measures Mean Time Between Critical Failures (MTBCF) to assess
system reliability, and the MS B ADM set a threshold of xx hours as the EMD exit criterion.
As of July 31, 2019, the program had accumulated xx flight hours with xx critical failure events,
yielding an observed MTBCF of xx hours and a corrected MTBCF of xx hours. The observed and
corrected MTBCF both exceed the threshold and are trending positive. In addition, fault detection,
fault isolation, and false alarm rates have all improved in the most recent software build. I assess
the program meets the MTBCF EMD exit criterion.
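The MTBCF arithmetic in the example above is straightforward to reproduce. A minimal sketch, with hypothetical placeholder values since the slide redacts the actual figures as "xx":

```python
def mtbcf(operating_hours: float, critical_failures: int) -> float:
    """Mean Time Between Critical Failures: total operating hours
    divided by the number of critical failure events observed."""
    if critical_failures == 0:
        return float("inf")  # no critical failures observed yet
    return operating_hours / critical_failures

# Hypothetical placeholder values (not from the slide).
observed = mtbcf(operating_hours=1200.0, critical_failures=8)   # 150.0 hours
# "Corrected" MTBCF typically excludes failures whose root cause
# has a verified corrective action in place.
corrected = mtbcf(operating_hours=1200.0, critical_failures=5)  # 240.0 hours

threshold = 100.0  # hypothetical EMD exit criterion, in hours
print(observed >= threshold and corrected >= threshold)  # True
```

The same division also underlies the "trending positive" claim: recomputing MTBCF over successive data cut-off dates shows whether the interval between critical failures is growing.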
Survivability
▪ Summarizes the evaluation of the system’s ability to survive in threat environments
▪ Survivability includes – Cybersecurity, Live Fire (kinetic survivability), electronic threats, and
other threat environments (e.g., Nuclear Hardness and Survivability, and CBRNE)
▪ Survivability Considerations
– Identify if the Cybersecurity requirements can be evaluated
– Use of the Cybersecurity T&E guidebook, or service equivalent to structure T&E
– Method for identifying and prioritizing risks based on mission impacts from cyber effects
(Mission Based Cyber Risk Assessment)
– Discuss program coordination with DOT&E for
o Integrated Cybersecurity testing
o Live fire coupon testing and full-up system level testing or waiver status, and alternate plan
CBRNE – Chemical, Biological, Radiological, Nuclear, & Explosives
Survivability Example
Cybersecurity. The 47th Cyberspace Test Squadron (CTS) is the primary test
organization. Phases 1 and 2 are complete. Phase 3, Cooperative Vulnerability
Identification (CVI), is currently in progress using the SIL. Test objectives for CVI
include enumeration, scanning ports, e-tool connection port evaluation, and Data
Transfer Module (DTM) assessment. The 47th CTS has broken CVI into four separate
test periods, two of which are complete. The final CVI test report is expected in first
quarter FY2020. …
Live Fire. Coupon tests of a full-scale fuel tank demonstrated the xxxx capability to
withstand the hydrodynamic effects from VOLT-specified small arms caliber rounds.
Additional tests are planned to assess dry bay fire vulnerability from kinetic threats
identified in the VOLT.
Exit Criteria Discussion
▪ DT&E-related ADM entry/exit criteria should be addressed in each of the other
assessment areas
▪ Exit Criteria should be program specific, clear, and measurable
DTA vs DTSA Memorandum
DTSA Statutory Requirements
Section 838 of Public Law 115-91 (NDAA for FY18)
▪ Requires a Developmental Test and Evaluation (DT&E) Sufficiency Assessment (DTSA)
for Major Defense Acquisition Programs (MDAPs) at MS B and MS C
▪ Modifies Title 10, United States Code:
– Section 2366b(c)(1) now requires MS B Brief Summary Report to Congress to include summary of
MS B DT&E sufficiency assessment
– Section 2366c(a)(4) now requires MS C Brief Summary Report to Congress to include summary of
MS C DT&E sufficiency assessment
▪ States the minimum content for DT&E Sufficiency Assessments
▪ DTSA written by:
– Senior official responsible for DT&E within the Office of the Secretary of Defense (OSD) for ACAT ID
– Senior official responsible for DT&E within the Service for ACAT IB or IC
DTA Regulatory Requirement
DoDI 5000.89 Test and Evaluation, November 19, 2020
▪ DT&E Assessment Reporting
– “…For ACAT 1B/1C programs on the T&E oversight list for which USD(R&E) did not conduct a DT&E
sufficiency assessment, the USD(R&E) will provide the MDA with a program assessment at the
development RFP release decision point and MS B and C. This will be updated to support the
operational test readiness review or as requested by the MDA or PM.”
▪ DT&E Assessment Content
– The assessment will be based on the completed DT&E and any operational T&E activities completed
to date
– The DT&E assessment will address [at a minimum]
o The adequacy of the program planning
o Implications of testing results to date
o Risks to successfully meeting the goals of the remaining T&E events in the program
DTA vs DTSA
▪ Different purpose and audience, but similar DT&E basis
– DTSA is statutory for MS B & C; DTA is regulatory for RFP, MS B & C, OTRR +...
▪ Both require evaluation of:
– Adequacy of T&E planning
– Implications of T&E Results to date
▪ Minimum content:

MS B DT&E Sufficiency Assessment
• DT&E Planning & Resources
• DT&E Schedule (analogous system comparison)
• DT&E Risks and Production Concurrency
• Developmental Test Criteria for Entering LRIP

MS C DT&E Sufficiency Assessment
• DT&E Planning & Resources for Remaining DT
• DT&E Completed
• Risks to Production and Deployment
• Readiness for IOT&E

DT&E Assessment
• DT&E Planning & Resources
• System Capabilities
• System Integration
• System Survivability
• System Sustainment
• Implications for decision
Preparation and Approval Timeline
Program Engagement: continuous
Tentative Data Cut-Off Date and Report Drafting: D-60 days
Draft Report Complete: D-50 days
Director, Tech Staff, & PM Concurrent Review Complete: D-45 days
Final Report to D,DTE&A: D-35 days
D,DTE&A Approval: D-30 days
Decision: D-Day
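Given a decision date, the D-minus offsets above back-plan directly to calendar dates. A minimal sketch; the offsets come from the timeline, while the decision date and function name are illustrative:

```python
from datetime import date, timedelta

# Offsets in days before the decision (D), from the timeline above.
# Program Engagement is continuous, so it carries no fixed offset.
MILESTONES = [
    ("Tentative data cut-off date / report drafting", 60),
    ("Draft report complete", 50),
    ("Director, tech staff, & PM concurrent review complete", 45),
    ("Final report to D,DTE&A", 35),
    ("D,DTE&A approval", 30),
    ("Decision (D-Day)", 0),
]

def milestone_dates(decision: date) -> list[tuple[str, date]]:
    """Back-plan each preparation milestone from the decision date."""
    return [(name, decision - timedelta(days=offset)) for name, offset in MILESTONES]

for name, due in milestone_dates(date(2024, 6, 1)):  # hypothetical decision date
    print(f"{due.isoformat()}  {name}")
```

For a notional June 1 decision, the data cut-off lands on April 2 and director approval on May 2, which makes the 60-day lead time concrete when coordinating with a program office.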
DTA Memorandum Template
▪ Purpose
▪ Assessment
– Supporting bullet points from DTA conclusions
▪ Recommendation
▪ POC Information
DTA Memorandum
~ 1-2 Pages
DTA Memorandum Assessment
▪ A concise statement of the DTA bottom line conclusion and relevant impact to
the milestone or decision being made.
▪ Example (Milestone C decision)
– Assessment: I assessed the (Program) capabilities and system maturity, and
concluded the system will require configuration changes before initial fielding. My
analysis is based on demonstrated capabilities and identified deficiencies from
completed T&E as of (Date). The following highlights form the basis of my
assessment:
▪ Include supporting bullet points from the DTA assessment areas that directly
support the assessment conclusion and any recommendation(s).
DTA Memorandum Recommendation
▪ A concise statement of any recommendation that would mitigate or address
issues identified in the assessment.
▪ Example
– Recommendation: I recommend mitigating the configuration change risks by making future
incremental production lot buys contingent on knowledge gained from (key event)
T&E of the (System) with (deficiency correction).
▪ Do not recommend a particular decision outcome (e.g., proceeding with
production).
DTSA Memorandum Template
▪ Purpose
▪ Assessment & Basis
– DT&E Planning
– DT&E Schedule
– DT&E Resources
– Concurrency Risks
– Entrance Criteria
– Additional Info
▪ POC Information
DTSA Memorandum
2+ Pages
MS B DT&E Sufficiency Assessment Process
The Program Office prepares the MS B TEMP, which contains:
• DEF
• Schedule
• Resources
• Plans
• Risks
• Entrance/Exit Criteria

The Developmental Evaluation Framework (DEF), also a Program Office product, links key elements of the TEMP for DT&E:
• Decision support / schedule
• Evaluation results to support the DT&E Sufficiency Assessment
• Test events / phases

The MS B DT&E Sufficiency Assessment (written by USD(R&E) or the Senior Service DT&E Official; planning focused) is an MDAP assessment of:
• DT&E plans
• DT&E schedule
• DT&E resources
• DT&E risks and production concurrency
• Developmental test criteria for entering LRIP

The MS B Brief Summary Report to Congress (submitted by the MDA: USD(A&S) or Component Acquisition Executive) includes a summary of:
• DT&E sufficiency
• Program cost
• Program schedule
• ITRA
• etc.
MS C DT&E Sufficiency Assessment Process
The Program Office prepares the MS C TEMP, which contains:
• DEF
• Schedule
• Resources
• Plans
• Risks
• Entrance/Exit Criteria
• Capabilities
• Integration
• Survivability
• Sustainment

The Developmental Evaluation Framework (DEF), also a Program Office product, links key elements of the TEMP for DT&E:
• Decision support / schedule
• Evaluation results to support the DT&E Sufficiency Assessment
• Test events / phases

An assessment of DT&E results (to date) feeds the MS C DT&E Sufficiency Assessment (written by USD(R&E) for ACAT ID or the Senior Service DT&E Official; results focused), an MDAP assessment of:
• DT&E completed
• DT&E plans (for remaining DT&E)
• Risks to production and deployment
• DT&E resources (for remaining DT&E)
• Readiness for IOT&E

The MS C Brief Summary Report to Congress (submitted by the MDA: USD(A&S) or Component Acquisition Executive) includes a summary of:
• Sufficiency assessment of completed DT&E
• Program cost
• Program schedule
• Planned dates for IOT&E
• etc.
Summary
DT&E Assessments inform key decisions or knowledge points
– right information at the right time.
Well-written DTA characteristics
▪ Accurate, Unbiased, and Balanced – Non-advocate perspective
– Neither an antagonist nor a protagonist of the system; unemotional; provide both capabilities and
limitations
▪ Concise
– a clear, succinct writing style; minimize insignificant details
▪ Reflects critical thinking – Not A Summary of Test Results
– Presents evaluation of test results and substantiates conclusions; includes the ‘so what?’ for the
decision and mission impact(s)
▪ Informs next actions – ‘Now What?’
– Not just issues; includes recommendations of realistic actions that could be taken to mitigate issues
and improve the system/program
Open Discussion and Questions