Using Ontologies to Quantify Attack Surfaces
Mr. Michael Atighetchi, Dr. Borislava Simidchieva, Dr. Fusun Yaman, Raytheon BBN Technologies
Dr. Thomas Eskridge, Dr. Marco Carvalho, Florida Institute of Technology
Captain Nicholas Paltzer, Air Force Research Laboratory
03.10.10
Distribution Statement “A” (Approved for Public Release, Distribution Unlimited). This material is based upon work supported by the Air Force Research Laboratory under Contract No. FA8750-14-C-0104.
Context
• Problem: Defense selection and configuration is a poorly understood, non-quantifiable process
– Add defenses that provide little value or even increase the attack surface
– Introduce unacceptable overhead
– Cause unintended side effects when combining multiple defenses
[Figure: A Cyber Operations Officer (COO) selects among static cyber defenses, dynamic cyber defenses, and moving target defenses for networked systems. Users and attackers both use critical services and derivative critical services that operational cyber missions require; the goal is attacks stopped with acceptable goodput.]
• Objective: Provide tools enabling automated security quantification of distributed systems with a focus on architectural patterns
– Model key concepts related to cyber defense
– Provide algorithms to quantify and minimize attack surfaces
– Focus on Moving Target Defense
Systematic Quantification of Defense Postures
[Figure: Current state of the art vs. cyber C2 with reasoning and characterization. Today, a cyber defender manually selects and configures cyber defenses for a networked system via manual trial & error, with unknown security metrics. With reasoning and characterization, the defender computes attack surface metrics, automatically selects and configures appropriate cyber defenses, and intelligently executes experiments.]
Attack Surface Reasoning (ASR)
• Objective: Measure attack surfaces for security quantification
– Establish appropriate metrics for quantifying different attack surfaces
– Incorporate mission security and cost measurements
– Address usability issues through representative and composite measures of effectiveness
• Technical Achievements
– Models for attack surfaces that include systems, defenses, and attack vectors to enable quantitative characterization of attack surfaces
– Metrics for characterizing the attack surface of a dynamic, distributed system at the application, operating system, and network layers
– Algorithms for evaluating the effectiveness of defenses and minimizing attack surfaces
[Figure: ASR overview. Unified models of the system, defenses, attack vectors, adversary, and mission feed metrics and algorithms that compute metric instances for attack surface characterization and minimization. A cyber planning tool supports what-if analysis, exploration, and deployment interaction. Moving Target Defenses (MTDs) transform given software artifacts to protect mission-critical systems and apps.]
Modeling Approach
• Express a configuration C as a collection of OWL models
– C = {system, defense, attack, adversary, mission, metrics}
– Ontology openly available at https://ds.bbn.com/projects/asr.html
• Focus on interactions between distributed components
– Adversaries tend to take advantage of weak seams
• Make as few assumptions about adversaries as possible
– Minimize “garbage in, garbage out” problems
• Leverage extensible knowledge representation frameworks with powerful query languages
– Ontologies expressed in OWL
– Models can be queried with SPARQL
• Automate model creation when possible
– Increase consistency and minimize cost of manual model creation
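Since each configuration is a set of OWL models queried with SPARQL, the query pattern can be sketched with a toy in-memory triple store. The `asr:` class and property names below are illustrative placeholders, not terms from the published ontology:

```python
# Minimal sketch: a configuration as a set of RDF-style triples, queried
# with a (subject, predicate, object) pattern match. In ASR the models are
# OWL ontologies queried with SPARQL; entity names here are illustrative.

def match(triples, pattern):
    """Return one variable binding per triple matching the pattern.
    Pattern terms starting with '?' are variables."""
    results = []
    for triple in triples:
        binding = {}
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                binding[term] = value
            elif term != value:
                break
        else:
            results.append(binding)
    return results

# Toy system model: one server process reached by one client data flow.
config = {
    ("web-server", "rdf:type", "asr:Process"),
    ("client", "rdf:type", "asr:ExternalEntity"),
    ("flow-1", "rdf:type", "asr:DataFlow"),
    ("flow-1", "asr:source", "client"),
    ("flow-1", "asr:target", "web-server"),
}

# Roughly "SELECT ?f ?t WHERE { ?f rdf:type asr:DataFlow . ?f asr:target ?t }"
flows = match(config, ("?f", "rdf:type", "asr:DataFlow"))
for b in flows:
    target = match(config, (b["?f"], "asr:target", "?t"))[0]["?t"]
    print(b["?f"], "->", target)
```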
Systems Model
• Capture the relevant aspects of systems
• Based on Microsoft’s STRIDE dataflow model
– Process
• DLLs, EXEs, services
– External Entity
• People, other systems
– Data Flow
• Network flow, function call
– Data Store
• File
• Database
– Trust Boundary
• Process boundary
• File system
• Extensions
– Hierarchical layering
– Inclusion of specific concepts to make models more understandable
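As an illustrative sketch (plain Python classes, not the ontology’s actual terms), the STRIDE elements and the hierarchical-layering extension might look like:

```python
# Sketch of the STRIDE-based system model elements listed above, using
# plain dataclasses; class and field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    layer: str = "application"   # hierarchical-layering extension

@dataclass
class Process(Element):          # DLLs, EXEs, services
    pass

@dataclass
class ExternalEntity(Element):   # people, other systems
    pass

@dataclass
class DataStore(Element):        # files, databases
    pass

@dataclass
class DataFlow(Element):         # network flow, function call
    source: Element = None
    target: Element = None

@dataclass
class TrustBoundary(Element):    # process boundary, file system
    contains: list = field(default_factory=list)

# A server process inside a host trust boundary, reached by a client flow.
server = Process("web-server")
client = ExternalEntity("client")
flow = DataFlow("https-request", layer="network", source=client, target=server)
host = TrustBoundary("host-1", contains=[server])
```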
Attack Model
[Figure: Sources of modeled attack types, which express high-level attack steps — Microsoft STRIDE (6 categories: Spoofing, Tampering, Repudiation, Information Disclosure, Denial of Service, Elevation of Privilege), MITRE CAPEC (Common Attack Pattern Enumeration and Classification, >500 patterns), and MITRE CWE (Common Weakness Enumeration, >943 weaknesses).]
Attack Step Model
Example AttackStep definition: [shown as a figure in the original slides]
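The example definition on this slide is an image; purely as a hypothetical sketch, an attack step can be read as a precondition/effect rule over model facts, so that attack vectors are sequences of steps whose preconditions chain. The step name and predicates below are invented for illustration:

```python
# Hypothetical sketch: an attack step as a precondition/effect rule over
# model facts (STRIDE category: Information Disclosure). The predicates
# are illustrative, not taken from the ASR ontology.

def network_sniff(state):
    """Succeeds if the adversary can observe an unencrypted flow; the
    effect records that the flow's data has been disclosed."""
    preconditions = {("adversary", "canObserve", "flow-1"),
                     ("flow-1", "encrypted", "false")}
    if preconditions <= state:      # all preconditions hold in this state
        return state | {("adversary", "hasDisclosed", "flow-1")}
    return None                     # step not applicable

state = {("adversary", "canObserve", "flow-1"),
         ("flow-1", "encrypted", "false")}
after = network_sniff(state)
```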
Current Set of Modeled Attack Steps
Adversary Model
• Captures assumptions we make about adversaries
– Starting position
– Overall objective of the attack
• Quantification experiments assess attack surfaces across many different adversary models
• To increase efficiency of attack vector finding, knowledge of adversarial workflows can be expressed in AttackVectorTemplates
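A hypothetical sketch of how an AttackVectorTemplate could prune the search: treat the template as an ordered outline of step types and keep only candidate vectors whose steps follow that order. The phase names are illustrative, not from the ASR models:

```python
# Hypothetical sketch: an AttackVectorTemplate as an ordered outline of
# step types; a candidate vector is kept only if its step types contain
# the template phases in order. Phase names are illustrative.

def matches_template(step_types, template):
    """True if 'template' is an in-order subsequence of 'step_types'."""
    it = iter(step_types)
    return all(any(s == phase for s in it) for phase in template)

template = ["reconnaissance", "initial-access", "exfiltration"]
vector = ["reconnaissance", "initial-access", "lateral-movement",
          "exfiltration"]
ok = matches_template(vector, template)
```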
Defense Model
• Express the security provided and cost incurred by cyber defenses
• Defense models may add new entities to system models (new data flows, processes, etc.)
• Current set of modeled defenses includes three types of MTDs
– Time-bounding observables (e.g., IP Hopping)
– Masquerading (e.g., OS Masquerading)
– Time-bounding footholds (e.g., continuous restart via Watchdogs)
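An illustrative sketch of the defense-model idea: a defense transforms the system model (here, IP hopping invalidates an adversary’s address reconnaissance) and carries a cost annotation. The `knowsAddressOf` predicate and overhead percentages are made-up placeholders, not measured ASR characterization results:

```python
# Sketch: a defense as a transform on the system model plus a cost
# annotation. Predicate names and overhead figures are illustrative.

def apply_ip_hopping(model):
    """Time-bound network observables: addresses the adversary has
    observed become stale after a hop, so address-knowledge facts
    are invalidated; the defense also adds a new protecting entity."""
    kept = {f for f in model["facts"] if f[1] != "knowsAddressOf"}
    return {
        "facts": kept | {("ip-hopper", "protects", "web-server")},
        "latency_overhead_pct": model["latency_overhead_pct"] + 5,      # placeholder
        "throughput_overhead_pct": model["throughput_overhead_pct"] + 3,  # placeholder
    }

model = {"facts": {("adversary", "knowsAddressOf", "web-server"),
                   ("web-server", "rdf:type", "asr:Process")},
         "latency_overhead_pct": 0, "throughput_overhead_pct": 0}
defended = apply_ip_hopping(model)
```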
Mission Model
• Missions are simply modeled as a subset of data flows together with information security and cost requirements
– Security requirements are expressed as Confidentiality, Integrity, Availability
– Cost requirements are expressed as % change of latency and throughput
• Missions (and their individual flows) can be in three distinct modes
– Pass, degraded, fail
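A minimal sketch of the pass/degraded/fail classification for a single mission flow, assuming (as a simplification not stated on the slide) that an unavailable flow fails outright and that cost requirements alone separate pass from degraded. Field names and thresholds are illustrative:

```python
# Sketch: classify one mission flow as pass / degraded / fail from its
# measured overheads against its cost requirements. Illustrative only.

def flow_mode(measured, required):
    if not measured["available"]:           # availability violated -> fail
        return "fail"
    within_cost = (
        measured["latency_change_pct"] <= required["max_latency_change_pct"]
        and measured["throughput_change_pct"]
            >= required["min_throughput_change_pct"]
    )
    return "pass" if within_cost else "degraded"

required = {"max_latency_change_pct": 10, "min_throughput_change_pct": -10}
mode = flow_mode({"available": True, "latency_change_pct": 5,
                  "throughput_change_pct": -2}, required)
```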
Metrics Model
• Attack surface metrics are themselves expressed through a model
• Cover {system & mission, security & cost} dimensions
Attack Surface Indexes
Aggregate Security Index (ASI)
• Attacker Workload: Minimum length of attack vectors
• Coverage over known attacks: Number of attack vectors
• Coverage over unknown attacks: Number of entry points and exit points
• Probabilistic time-to-fail: Duration distributions of attack vectors and estimated probability of attack success

Aggregate Cost Index (ACI)
• Latency: Overhead on critical flows
• Throughput: Overhead on critical flows

Aggregate Mission Index (AMI)
• Latency & Throughput: Resource use on critical flows
• Confidentiality | Integrity | Availability: Required security on critical flows

ASI (security) and ACI (cost) are reported as integers; AMI (mission) is reported as Pass | Degraded | Fail.
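The security-side ASI ingredients listed above can be sketched directly from a set of attack vectors and the model’s entry/exit points; how ASR weighs them into a single integer is not specified on the slide, so no aggregation is attempted here:

```python
# Sketch: ASI ingredients computed from attack vectors (lists of attack
# steps) and the system model's entry and exit points. Illustrative only.

def asi_ingredients(vectors, entry_points, exit_points):
    return {
        # Attacker workload: minimum length of attack vectors
        "attacker_workload": min(len(v) for v in vectors),
        # Coverage over known attacks: number of attack vectors
        "known_attack_coverage": len(vectors),
        # Coverage over unknown attacks: number of entry and exit points
        "unknown_attack_coverage": len(entry_points) + len(exit_points),
    }

vectors = [["scan", "sniff", "exfiltrate"], ["scan", "exploit"]]
ingredients = asi_ingredients(vectors, entry_points={"flow-1"},
                              exit_points={"flow-2"})
```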
Quantification Methodology
1. Wrap Defense
2. Scan System into Model
3. Characterize Defense
4. Quantify Attack Surface
5. Validate Attack Vectors

[Figure: The networked system and mission are scanned into a Virtual Experimentation Environment, where defenses are wrapped around system models and attacks are run against them. Example attack surface results:

ASI   ACI   AMI
123     0   Fail
 -5   +12   Fail
+13    +3   Degraded
+23    +5   Pass]

Analytics:
• Cost and security metrics
• Attack vector finding
• Attack surface minimization

Experimentation:
• System auto-scan
• Defense cost characterization
• Attack vector validation
Experimental Results
• Manually generated models of tens of hosts and a small number of defenses and attack steps to validate algorithms
• Deployed scanning capabilities on BBN network and virtualized network at customer location and automatically generated system models from live systems
• Explored runtime complexity of attack vector finding and metrics computation algorithms using a random model generator and hundreds of hosts to measure scalability
[Plot: Analysis time (0–400 sec) vs. number of hosts (100–500), for attack step libraries of 3 and 6 attack steps.]
Conclusion and Next Steps
• We created a framework for quantifying attack surfaces using semantic models
– Our ontologies are openly available at https://ds.bbn.com/projects/asr.html
– We hope you will try them out and provide feedback!
• Next Steps
– Automate defense deployment exploration within a system through a genetic search algorithm
– Include metrics to capture interaction effects between multiple cyber defenses
– Expand scenarios to enterprise-scale regimes
– Extend the set of modeled cyber defenses beyond MTDs
• Proxy overlay networks, deception, reactive defenses
Contacts
• Mr. Michael Atighetchi, [email protected]
• Dr. Borislava Simidchieva, [email protected]
• Dr. Fusun Yaman, [email protected]
• Dr. Thomas Eskridge, [email protected]
• Dr. Marco Carvalho, [email protected]
• Captain Nicholas Paltzer, [email protected]