Transcript: U.S. Army Test and Evaluation Command briefing, ITEA System of Systems Conference

Page 1:

U.S. Army Test and Evaluation Command
Mr. David Jimenez
Executive Technical Director – Deputy to the Commander
ITEA System of Systems Conference
Reducing Risk in 2020 through Test & Evaluation
28 January 2015

Page 2:

ATEC: Reducing Risk in 2020 through Test & Evaluation

Optimizing T&E Data
• Obtaining and Retaining Information
• Rapid Analysis
• Greater Breadth of Analysis
• Advanced Cybersecurity T&E


Gartner’s 2015 Trends: #4 – Advanced, Pervasive, and Invisible Analytics

Page 3:

2025 Test & Evaluation and Big Data

Goals:
• Utilize knowledge, information, and data to achieve core mission and business objectives.
• Faster, More Accurate Decision Making
• Cost Optimization
• Quicker Responses to Requests for Information
• More Holistic Test and Evaluation
• Automated tracking of items or status
• Make useful big data capabilities available to everyone, but tailored to specific needs.

Common Core Requirements:
• Sustainment of data for long-term use (archival)
• Discoverability and access to data
• Analytics of historical and current information
• Derive context to inform decision making

2025 T&E:
• Leveraging Historical Data
• Faster, More Sophisticated Analytical Tools
• Modeling & Simulation
• Design of Experiments
• Cloud Computing

Page 4:

Use Case 1 – Evaluation Context

Task: For the Armored Multi-Purpose Vehicle (AMPV):
• Develop an Evaluation Strategy
• Test/Fix/Test
• Evaluate
• Analysis and Recommendations

Use Historical Information to Inform T&E. The evaluation context draws on:
• AMPV: previous tests; capability gaps; historical test data and analysis; variations from the future version
• M113: capability gaps; historical test data and analysis; variations from the AMPV
• Bradley: capability gaps; historical test data and analysis; variations from the AMPV

Page 5:

Leveraging Historical Data: Determining Risk Areas Prior to Test

GOAL: Analyze historical reliability risk areas observed during system test and fielding to inform future reliability testing.

• Identify and categorize major replacement components
• Categorize into functional groups; review historical changes
• Analysis included reviewing over 18M miles of field data and test data, with risk areas drawn from both the test data and the field data
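A minimal sketch of the kind of data reduction this describes: bucket replacement-part records into functional groups and rank groups by failures per mile. The file name, column names, and prefix-to-group mapping are hypothetical; the actual analysis spanned 18M+ miles of test and field records.

```python
import collections
import csv

# Hypothetical mapping from part-number prefix to functional group.
GROUPS = {"ENG": "powertrain", "SUS": "suspension", "ELX": "electronics"}

fail_counts = collections.Counter()
miles = 0.0
with open("fleet_maintenance.csv") as f:        # hypothetical export
    for row in csv.DictReader(f):
        group = GROUPS.get(row["part_no"][:3], "other")
        fail_counts[group] += 1                 # one row = one replacement
        miles = max(miles, float(row["cum_miles"]))  # fleet cumulative miles

print(f"{'group':<12} {'failures':>8}   rate")
for group, n in fail_counts.most_common():
    print(f"{group:<12} {n:>8}   {n / miles * 1e6:.1f} per 1M miles")
```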

Page 6:

Improved Analysis Techniques: Wheeled Vehicle Testing

• Improved analysis techniques and evaluation methodologies are leading to more efficient reliability testing: 3,000 miles avoided and $135,000 in savings.
• Increased use of vehicle instrumentation has saved a wheeled vehicle test program 15,000 miles and $400,000.

Page 7:

Design of Automotive Reliability Test and Evaluation (DART)

A multi-organization working group leveraging insight from industry and academia to further improve the effectiveness and efficiency of automotive reliability T&E.

Focus: Cooperative Research & Development, Compressed T&E Methodology, Automotive Reliability T&E Guidance, Contracting, Policy

• Full-up System Level (FUSL) testing alone is inefficient for reliability growth. M&S and compressed FUSL testing are valuable test tools that should be utilized early and often to accelerate understanding.
• Align government reliability testing with the systems engineering “V” to include developmental testing of components and subsystems with defined entrance/exit criteria prior to FUSL testing (crawl – walk – run).
• Tailor testing to the full range of operational usage, including extreme natural environments (XNET).

Lifecycle phases (Milestones A, B, C): Materiel Solution Analysis; Technology Development; Engineering & Manufacturing; Production & Deployment; Post Fielding, ECPs.

Practices across the lifecycle: Design for Reliability; Sys RBD Model; FMECA; Best Practices Language; robust subsystem PS&T & Physics of Failure; robust PS&T, compressed FUSL, XNET; FUSL; Operational Testing; increased utilization of available data (MLLP, SDC, GCSS-A, Black Box).

Vision for Reliability T&E in a Complex World (unknowable): targeted and informed testing.

Page 8:

Building Tools to Make Better Use of the Data We Collect

• AEC developed visualization tools to make better use of large data sets obtained from data acquisition systems to support evaluations of system effectiveness, suitability, and survivability.
• Data sets are mined to:
  • Provide additional context and details of test events.
  • Better understand the mission impact of test events.
  • Support in-depth root-cause analysis of events.
• Next steps: a data integration and visualization initiative to add geospatial context to system status.

Instrumentation data sets are 100 GB per test day! Visual “fingerprints” allow evaluators to see what manual data collection methods could not.
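The briefing does not show how the fingerprints are built; a plausible minimal sketch is a 2-D density image over a large instrumentation log. The CSV source and column names here are hypothetical stand-ins for a data acquisition system export.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical DAS export: one row per logged message with a timestamp.
df = pd.read_csv("das_export.csv", parse_dates=["timestamp"])

hour = df["timestamp"].dt.hour + df["timestamp"].dt.minute / 60
day = df["timestamp"].dt.dayofyear

# A 2-D density image: each test day becomes one row of the "fingerprint",
# so gaps, bursts, and drift stand out at a glance.
fig, ax = plt.subplots(figsize=(8, 3))
ax.hist2d(hour, day, bins=(96, day.nunique()), cmap="viridis")
ax.set_xlabel("time of day (hr)")
ax.set_ylabel("test day")
ax.set_title("Activity fingerprint: message density per 15-min bin")
fig.savefig("fingerprint.png", dpi=150)
```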


Page 9:

Meeting the Challenge of System Complexity

As systems and systems of systems increase in complexity, new methods are required to ensure adequate test and evaluation in spite of resource constraints.

AEC is meeting the challenge of complexity with:
• Advancements in Data Collection, Reduction, and Analysis (DCRA), including automated data acquisition systems.
• Scientific Test and Analysis Tools (STAT), including Design of Experiments.
• Advancements in empirical simulation methods, especially for understanding statistical risks at the system-of-systems level.

AEC is utilizing advanced methodology to ensure test adequacy when the test configuration of a system differs from the field configuration.

Page 10:

Calculating Reliability for Systems of Systems with Redundancy

For a subsystem in which any $k$ of $n$ identical components must function (component reliability $R_c$), the block reliability is the binomial sum

$$R_{k\text{-of-}n} = \sum_{i=k}^{n} \frac{n!}{i!\,(n-i)!}\, R_c^{\,i} \left(1 - R_c\right)^{n-i},$$

and the system-of-systems reliability is the product of its blocks. [Figure: observed failure times plotted against the ideal distribution.]

Eliminating unnecessary testing through thoughtful application of analytical techniques:
• The traditional exponential distribution assumption gives an MTBF requirement of 683 hrs.
• Direct analytical calculation gives an MTBF requirement of 226 hrs, which can be validated with a shorter test.
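To make the arithmetic concrete, here is a minimal Python sketch of the same comparison: solve for the component MTBF that meets a system reliability target, first with no redundancy credit and then crediting a k-of-n block directly. The 4-of-5 architecture, 100-hr mission time, and 0.90 target are illustrative assumptions; the 683-hr and 226-hr figures above come from the actual system's structure.

```python
from math import comb, exp
from scipy.optimize import brentq

def r_k_of_n(mtbf, t, n, k):
    """Reliability at time t of a k-of-n block of identical exponential parts."""
    p = exp(-t / mtbf)   # single-component reliability at time t
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

T_MISSION = 100.0   # assumed mission duration (hrs)
TARGET = 0.90       # assumed system reliability requirement

# No redundancy credit: demand all five components survive (pure series).
mtbf_series = brentq(lambda m: exp(-T_MISSION / m) ** 5 - TARGET, 1.0, 1e9)

# Direct calculation crediting the 4-of-5 redundancy.
mtbf_k_of_n = brentq(lambda m: r_k_of_n(m, T_MISSION, 5, 4) - TARGET, 1.0, 1e9)

print(f"series assumption:  component MTBF >= {mtbf_series:.0f} hrs")
print(f"4-of-5 redundancy:  component MTBF >= {mtbf_k_of_n:.0f} hrs")
```

The redundant calculation demands a far lower component MTBF, which is exactly why it can be validated with a shorter test.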

Page 11:

Optimizing Resources in a System of Systems Test Event

Subsystems: A-Kit, B-Kit, C2. Test data are collected at the subsystem level, then combined using a reliability block diagram approach.

Option 1: 1,500 hours on A-kits, 1,200 hours on B-kits, 2,000 hours on C2. Producer risk*: 75%.

Option 2: 2,000 hours on A-kits, 800 hours on B-kits, 1,000 hours on C2. Producer risk*: 25%.

*Probability the system will fail the test even if it meets requirements.

Using advanced techniques to optimize test design and minimize risk.
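A hedged sketch of how producer's risk for such allocations could be estimated: simulate subsystem test results, roll them up through a series block diagram, and count how often a truly compliant system fails the test. The true MTBFs, the 180-hr system requirement, and the point-estimate pass rule are assumptions for illustration, not the briefing's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed true subsystem MTBFs (hrs); the series-system MTBF they imply
# (~190 hrs) just meets the assumed 180-hr requirement.
TRUE_MTBF = {"A-kit": 600.0, "B-kit": 400.0, "C2": 900.0}
SYS_REQ = 180.0
N_TRIALS = 20_000

def producer_risk(hours):
    """P(test shows system MTBF < requirement | system actually meets it).

    Uses a simple point-estimate pass rule; real plans use confidence bounds.
    """
    fails = 0
    for _ in range(N_TRIALS):
        # Failures per subsystem ~ Poisson(test hours / true MTBF); the
        # block-diagram rollup for a series system sums the failure rates.
        est_rate = sum(rng.poisson(h / TRUE_MTBF[s]) / h
                       for s, h in hours.items())
        if est_rate > 0 and 1.0 / est_rate < SYS_REQ:
            fails += 1
    return fails / N_TRIALS

opt1 = {"A-kit": 1500, "B-kit": 1200, "C2": 2000}
opt2 = {"A-kit": 2000, "B-kit": 800, "C2": 1000}
print(f"Option 1 producer risk ~ {producer_risk(opt1):.0%}")
print(f"Option 2 producer risk ~ {producer_risk(opt2):.0%}")
```

The design lever is where the hours go: allocating more time to the noisiest subsystems tightens the rolled-up estimate and lowers the producer's risk.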

Page 12:

Design of Experiments and Statistical Analysis

• Plan sequentially for discovery: What are the goals, responses, and factors?
• Design to control risks and span the battlespace: How many points? Which points?
• Execute to control uncertainty and reduce bias: How should runs be sequenced?
• Analyze statistically to model performance: What conclusions?

Design families:
• General Factorial: 3x3x2 design
• 2-level Factorial: 2^3 design
• Fractional Factorial: 2^(3-1) design
• Response Surface: Central Composite design

[Figure: Design-Expert® response surface of Cm (pitching moment) versus A: Alpha (angle of attack, 12 to 28) and D: Ail (aileron deflection, -30 to 30), with actual factors B: Psi = 0.00, C: Canard = 0.00, E: Rudder = 0.00.]

Example 2^3 run matrix (randomized run order):

TRIAL   A    B    C
    2   HI   LO   LO
    8   HI   HI   HI
    5   LO   LO   HI
    4   HI   HI   LO
    1   LO   LO   LO
    3   LO   HI   LO
    7   LO   HI   HI
    6   HI   LO   HI
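A minimal sketch of generating such a matrix: enumerate all 2^3 level combinations and shuffle the execution order, as the TRIAL column reflects. The trial-numbering convention (which factor varies fastest in standard order) may differ from the slide's.

```python
import itertools
import random

levels = ("LO", "HI")
# Standard-order matrix: all 2^3 level combinations of factors A, B, C.
runs = list(enumerate(itertools.product(levels, repeat=3), start=1))

random.seed(4)        # arbitrary seed so the shuffle is repeatable
random.shuffle(runs)  # randomized execution order guards against bias

print("TRIAL  A   B   C")
for trial, (a, b, c) in runs:
    print(f"{trial:>5}  {a:<3} {b:<3} {c:<3}")
```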


Page 13:

System of Systems Example: System X Initial Operational Test with System Y

• The design facilitated a side-by-side comparison between modernized and legacy aircraft, with the focus of evaluation on the additional capability provided when teaming with unmanned aircraft systems (UAS).
• Primary response variables include target affiliation range, engagement range, and mission success.

In this sample scenario, the statistical design verified that the target acquisition range for the system-of-systems team was significantly greater than the autonomous capability.

Test design matrix (conditions: With UAS / Without UAS, each Day and Night):
• Legacy System X – Recon: 2 runs; Security: 3 runs; Close Combat Attack: 3 runs; Interdiction Attack: 2 runs
• Modernized System Y – Recon: 2 runs; Security: 3 runs; Close Combat Attack: 3 runs; Interdiction Attack: 2 runs
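The shape of the comparison behind the "significantly greater" claim might look like the sketch below, with a two-sample test on acquisition ranges. The data are synthetic and the means are assumed; the actual analysis would treat mission type and day/night as design factors rather than pooling runs.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
acq_with_uas = rng.normal(loc=9.0, scale=1.5, size=24)   # notional km
acq_without = rng.normal(loc=6.5, scale=1.5, size=24)

t, p = stats.ttest_ind(acq_with_uas, acq_without, equal_var=False)
print(f"mean with UAS:    {acq_with_uas.mean():.1f} km")
print(f"mean without UAS: {acq_without.mean():.1f} km")
print(f"Welch t = {t:.2f}, p = {p:.2g}")   # small p -> significant gain
```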

Page 14:

Communication System Analysis Example: Over-the-Air Connection Pathways Between Two Nodes

Statistical modeling allowed AEC to examine connection path as a function of range (sample system):

At Range 0:
• 68% Terrestrial – One Hop
• 14% Terrestrial – Multi Hop
• 7% Celestial
• 3% No Connection

At Range 2:
• <3% Terrestrial – One Hop
• 18% Terrestrial – Multi Hop
• 60% Celestial
• 14% No Connection

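One plausible form for "connection path as a function of range" is a multinomial logistic model; here is a hedged sketch fit to synthetic observations shaped like the slide's percentages. The listed shares do not sum to 100%, so they are renormalized, and "<3%" is taken as 3%; the sample sizes are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
paths = ["one-hop", "multi-hop", "celestial", "none"]

# Category shares from the slide (Range 0 and Range 2), renormalized.
probs = {0: [0.68, 0.14, 0.07, 0.03],
         2: [0.03, 0.18, 0.60, 0.14]}

X, y = [], []
for r, p in probs.items():
    p = np.array(p) / np.sum(p)
    y += list(rng.choice(4, size=300, p=p))   # 300 synthetic observations
    X += [[r]] * 300

model = LogisticRegression(max_iter=1000)     # lbfgs fits a multinomial model
model.fit(np.array(X), np.array(y))

for r in (0, 1, 2):                           # r=1 is interpolated by the fit
    est = model.predict_proba([[r]])[0]
    print(f"range {r}:", {name: round(float(e), 2) for name, e in zip(paths, est)})
```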

Page 15:

Guided Projectile System: Developmental Test Design and Analysis

• A 32-shot DOE allowed evaluation of multiple key performance requirements in a single test.
• The DOE accommodated several test site constraints, allowing the design to be executed in the allotted time.
• Statistical modeling and summary statistics indicated the threat factor had no significant impact on missed distance.
• The program obtained 2-Star approval to accept the analysis of threat in DT and eliminate threat as a factor in OT.

Missed Distance   Level 1 (n=19)   Level 2 (n=13)
Mean                   22               30
CEP                    20               32

DOE implementation & analysis resulted in ~$Ks program cost savings.
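A minimal sketch of the summary statistics above: mean radial miss and CEP (the median radial miss, i.e., the radius containing half the impacts), plus a nonparametric check that the threat factor does not shift missed distance. The impact data are synthetic; only the sample sizes match the slide.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

def radial_miss(n, sigma=18.0):
    """Radial miss distances from a circular-normal impact pattern (assumed)."""
    dx, dy = rng.normal(0, sigma, n), rng.normal(0, sigma, n)
    return np.hypot(dx, dy)

level1, level2 = radial_miss(19), radial_miss(13)   # n's match the slide
for name, m in (("Level 1", level1), ("Level 2", level2)):
    print(f"{name}: mean = {m.mean():.0f}, CEP = {np.median(m):.0f}")

u, p = stats.mannwhitneyu(level1, level2)
print(f"Mann-Whitney p = {p:.2f}   (large p -> no threat-level effect)")
```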

ICD Requirement                                              →  Design Factors & Levels
Operate between Minimum and Maximum Specified Temperatures   →  Temp: Low, Med, High
Operate in Hostile Environment                               →  Threat: Level 1, Level 2
Operate with Multiple Fuze Modes                             →  Fuze Mode: A, B, C
Operate between Minimum and Maximum Specified Ranges         →  Range: Low, Med, High
Operate over a Range of Offsets (SME suggested)              →  Offset: Low, High
Operate over a Range of QE                                   →  QE: Low, Med, High

Page 16:

Objective Electronic Warfare Testing Sequence

Developmental Test → Integrated DT/OT → Operational Test (timeline markers: Milestone B, Milestone C, OTRR, IOT)

• Laboratory characterization and field test activities.
• EW Vulnerability Assessment to inform operational test design. An EW Vulnerability Assessment is a cooperative effort using M&S to investigate scenarios and improve the likelihood of effects on SoS missions during OT.
• Operational testing: integrating EW into the threat arsenal. Electronic warfare is integrated into the threat maneuver force during OT to try to deny/degrade Command and Control.
• Enhanced OT data reduction/analyses with M&S.

M&S development contributes to improved:
• Integration of disparate databases;
• Data visualization/playback;
• Link quality/performance analyses;
• Terrain and geometry effects; and
• Expanded evaluation and understanding.

Leverage M&S to inform testing, which in turn will validate the simulation.

Page 17:

Modeling: Inform, Analyze, and Improve

RF Coverage | Connectivity | Playback

Each event provides:
• An opportunity to expand evaluations
• Additional data to enhance the model

Page 18:

Risk Management Framework (RMF): Objective Cybersecurity Testing Sequence

Developmental Test → Integrated DT/OT → Operational Test (timeline markers: Milestone B, Milestone C, OTRR, IOT)

• Cooperative Vulnerability & Penetration Assessment (CVPA) (Step 4). A CVPA is an overt effort to identify vulnerabilities in preparation for OT.
• Operational Testing with TCNO (Step 5). TCNO emulates an enemy force during OT to try to subvert cyber protection.
• Adversarial Assessment (AA).
• Developmental test CVPA and AA characterization and field test activities at significant software updates.

Page 19:

Cybersecurity Test and Evaluation Approach: Major Software Drops – Example

The cybersecurity T&E approach (IAW AR 25-2, DoDI 8510.01, and DASD(DT&E) & DOT&E guidance*) mitigates software and security risks of fielding unproven platform equipment. Applicable data will be leveraged whenever available.

Software drop content: New Software | New Integration | New Hardware | Existing Evaluation

Subsystem Examples:
• Computing Systems
• Improved Displays
• New Processor Units
• Maneuver Control Enhancements
• Cross Domain Solution Adjustments
• Enhanced Training
• Improved Vehicle Management
• Improved Communications Manager
• CREW Device
• Tactical Communication Devices
• Battle Command Systems
• Power Distribution Systems

Software drop flow: OEM Cybersecurity Testing → Post-OEM Software Drop → DT Cybersecurity Testing → Post-DT Software Drop → OT Cybersecurity Testing → Post-OT Software Drop → Software Lifecycle Maintenance

“Cybersecurity requirements must be identified, tailored appropriately, and included in the acquisition, design, development, developmental and operational testing and evaluation, integration, implementation, operation, upgrade, or replacement of all DoD Platform Information Technology IAW DoDI 5200.44 and DoDI 5200.39, this instruction, and other cybersecurity-related DoD guidance, as issued.” (Ref: DoDI 8510.01)

Page 20:

Visualization of Network

Page 21:

ATEC: Reducing Risk in 2020 through Test & Evaluation

Questions/Discussion