Research Focused Cyber Evaluation
Mr. Douglas E. Stetson
29 May 2014
MPC Eval - 2 (DES 05/29/2014)
Cyber Assessment
• The process of characterizing a system that has computer components, using repeatable methods, for a specific purpose
• Assessment occurs throughout the technology lifecycle, from conception to disposal
• Purposes include:
  – Verification
  – Validation
  – Exploitation
  – Mitigation
• Methods include:
  – Analysis
  – Modeling
  – Simulation
  – Range testing
  – Experimentation
• Characterization may focus on:
  – Capabilities
  – Security
  – Risk
Goals of an Evaluation
• Help the program manager understand the progress and the capabilities of the performers
• Provide the performers with an independent assessment
• Push the clarification of the goals of the program
• Provide empirical results of the success of the program
• Find the limits of the researched technologies
Approaches to Cyber Assessment
There are a number of approaches to cyber assessment, each with different strengths that align with a range of assessment objectives.

              | Analysis | Modeling & Simulation | Cyber Range      | Prototype Deployment
Fidelity      | Low      | Low                   | Moderate to High | High
Scalability   | High     | High                  | Moderate         | Low
Cost          | Low      | Low                   | Moderate         | High
Repeatability | N/A      | High                  | Moderate to High | Low
Program Phase | Early    | Early                 | Mid-term         | Mid-term to Late
Metrics
• Metrics vs. success criteria:
  – Metric: How long did it take?
  – Success criterion: Was it less than 10 seconds?
• Success criteria ideally describe where you should be at the end of a research program
• Measures of Performance (MoP) give direct, quantitative evaluation against specific success criteria based on:
  – State-of-the-art baseline
  – Previous baseline
  – Intended-use requirements
  – Arbitrary targets
• Measures of Effectiveness (MoE) ideally give direct, quantitative evaluation of how useful the technology is:
  – May be qualitative
  – May be impossible in early research
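The metric/success-criterion distinction can be sketched in code. This is a minimal illustration (the task and names are hypothetical; the 10-second threshold is the example from the slide):

```python
import time

SUCCESS_THRESHOLD_S = 10.0  # success criterion from the slide: "less than 10 seconds"

def run_task():
    # Stand-in for the system under test (hypothetical workload).
    time.sleep(0.01)

start = time.monotonic()
run_task()
elapsed = time.monotonic() - start        # the metric: how long did it take?
passed = elapsed < SUCCESS_THRESHOLD_S    # the success criterion: was it < 10 s?
print(f"metric: {elapsed:.3f} s, success: {passed}")
```

The metric is the raw measurement; the success criterion is a judgment applied to it, and only the criterion needs to change as the program's goals are refined.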
Metrics
• Metrics may be multidimensional, e.g. efficiency gain vs. analysis time vs. security loss
• Success criteria provide a goal
• The best performers build great systems; mediocre performers focus on the metrics and success criteria
  – Metrics should enhance, not inhibit, research
  – Best practice is to refine them as the program progresses
[Figure: notional plot of efficiency gain vs. time vs. security]
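A multidimensional metric of the kind described above can be represented as a tuple of measurements with an independent success criterion per dimension. A minimal sketch (field names and thresholds are illustrative assumptions, not program values):

```python
from dataclasses import dataclass

@dataclass
class MetricPoint:
    # One evaluation data point along the three dimensions from the slide
    # (names and units are hypothetical).
    efficiency_gain: float   # e.g. speedup factor over the baseline
    analysis_time_s: float   # time the analysis took, in seconds
    security_loss: float     # e.g. leakage relative to the ideal

def meets_criteria(p: MetricPoint) -> bool:
    # Success criteria set a goal along each dimension independently
    # (thresholds here are made up for illustration).
    return (p.efficiency_gain >= 2.0
            and p.analysis_time_s <= 300.0
            and p.security_loss <= 1.0)

# Two hypothetical configurations: no single scalar ranks them,
# so the evaluation must report the whole tuple.
a = MetricPoint(efficiency_gain=4.0, analysis_time_s=120.0, security_loss=0.0)
b = MetricPoint(efficiency_gain=9.0, analysis_time_s=30.0, security_loss=2.0)
print(meets_criteria(a), meets_criteria(b))
```

Note that `b` is faster and more efficient but fails on security loss, which is exactly why a multidimensional metric cannot be collapsed into one number without hiding a trade-off.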
10 TB Database
[Figure: the same 10 TB database decomposed three ways:
  1K rows of 10 GB each, 100M rows of 100 KB each, or 10B rows of 1 KB each]
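The three decompositions in the figure all describe the same 10 TB total, which a quick check confirms (decimal units assumed):

```python
# Decimal units, as is conventional for storage sizes.
KB = 10**3
GB = 10**9
TB = 10**12

# (row count, bytes per row) for each decomposition in the figure.
shapes = [
    (1_000, 10 * GB),           # 1K rows of 10 GB each
    (100_000_000, 100 * KB),    # 100M rows of 100 KB each
    (10_000_000_000, 1 * KB),   # 10B rows of 1 KB each
]

for rows, row_size in shapes:
    assert rows * row_size == 10 * TB
print("all three shapes total 10 TB")
```

The point of the figure is that "a 10 TB database" underdetermines the evaluation: row count and row size stress a system very differently, so the test plan must pin down the shape, not just the total.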
Key Performer Roles for Effective T&E
1. Communicate goals clearly to the evaluation team
2. Read, understand, and provide feedback on the test plan
3. Participate in the creation and adoption of the program metrics
4. Actively participate in pre-evaluation exercises, which ensure a smooth evaluation
Bounding the Evaluation
• Use cases:
  – Help with understanding portions of the problem
  – Help with early focus
  – Help with defining early evaluations
• Limitations:
  – Can impede focus on the universe of problems
  – Can impede generalization
  – Will not limit later evaluations
Bounding the Evaluation
• There are multiple ways to construct the program, each driving the evaluation style:
  – Common performer goals, common evaluation
  – Performers solve pieces of the problem; evaluate each piece
  – Performers solve pieces of the problem; evaluate the integration as a whole
  – A combination of the above
  – ?
Summary
• Evaluation helps to focus the work by providing discernible goals
• It should be comprehensive enough to explain the capability without limiting the research
• Metrics will be the most important topic of discussion for ensuring the evaluation successfully improves the program