© Copyright William Young December 2016 WYOUNG@MIT.EDU
A Systems Approach to Security: Lessons from the Frontlines Applying STPA-Sec
William Young Jr, PhD
Annual Computer Security Applications Conference 2016
Los Angeles, CA
December 7, 2016
DISCLAIMER:
The views expressed in this presentation are
those of the presenter and do not reflect the official
policy or position of the United States Air Force,
Department of Defense, Air Combat Command, 53d
Wing, or the U.S. Government
Overview
• Foundation – System-Theoretic Concepts Applied to Security Engineering
• Application – System-Theoretic Process Analysis for Security (STPA-Sec)
• Illumination – Results and Lessons Learned
Bottom Line Up Front:
• STPA-Sec facilitates improved definition of the security problem and specification of the preferred approach
– Produces a functional architecture to complement the physical architecture
– Produces a set of scenarios to better appreciate how desired functionality might be used for undesired outcomes
• STPA-Sec complements existing approaches and tools
• STPA-Sec helps avoid the most common analysis pitfalls
– Omitted scenarios
– Incomplete scenarios
– Overly simple scenarios
– Wrong level of abstraction (for desired purpose)
FOUNDATION
Need to Address Security in Concept Phase
[Figure: Effectiveness & Cost to Fix rises from Low to High across the Systems Engineering Lifecycle (Concept, Development, Production, Utilization, Retirement), contrasting Secure System Development, Cyber Security Bolt-On, and Attack Response; Problem Analysis precedes Solution Development & Implementation.]
Problem: How to Define the Right Security Problem?
Ref: (Boehm; INCOSE, 2015; Maier, 2009)
Primary Associated Standards
IEEE/IEC/ISO 15288
• Business or mission analysis
• Stakeholder needs and requirements
• System requirements definition
NIST SP 800-160
• Business or mission analysis process
• Stakeholder needs and requirements definition
• System requirements definition
STPA-Sec Provides a Rigorous Methodology to Implement Existing Standards
Story of "Bob"
Just Because You Know What You Want To Build, Doesn't Mean You Have Defined the Problem
"By now we are all beginning to realize that one of the most intractable problems is that of defining problems (of knowing what distinguishes an observed condition from a desired condition) and of locating problems (finding where in the complex causal networks the trouble really lies). In turn, and equally intractable, is the problem of identifying the actions that might effectively narrow the gap between what-is and what-ought-to-be." – "Dilemmas in a General Theory of Planning," Horst Rittel and Melvin Webber
Formulating (Framing) a Wicked Problem is the Problem
Just Because You Can, Doesn't Mean You Should… Just Because it Works, Doesn't Mean it Can Be Secured
Security Today
• Find the most important components and protect them
• Compliance with standards and best practice will keep our systems secure from loss
• Breaking the "Kill Chain" prevents losses
[Figure 3: Late phase detection – kill chain phases Reconnaissance, Weaponization, Delivery, Exploitation, Installation, C2, Actions, with Analysis and Detection occurring in the late phases.]
on these tools and infrastructure, defenders force an adversary to change every phase of their intrusion in order to successfully achieve their goals in subsequent intrusions. In this way, network defenders use the persistence of adversaries' intrusions against them to achieve a level of resilience.
Equally as important as thorough analysis of successful compromises is synthesis of unsuccessful intrusions. As defenders collect data on adversaries, they will push detection from the latter phases of the kill chain into earlier ones. Detection and prevention at pre-compromise phases also necessitates a response. Defenders must collect as much information on the mitigated intrusion as possible, so that they may synthesize what might have happened should future intrusions circumvent the currently effective protections and detections (see Figure 4). For example, if a targeted malicious email is blocked due to re-use of a known indicator, synthesis of the remaining kill chain might reveal a new exploit or backdoor contained therein. Without this knowledge, future intrusions, delivered by different means, may go undetected. If defenders implement countermeasures faster than their known adversaries evolve, they maintain a tactical advantage.
[Figure 4: Earlier phase detection – the same kill chain phases, with Analysis, Detection, and Synthesis pushed into the earlier phases.]
3.5 Campaign Analysis
At a strategic level, analyzing multiple intrusion kill chains over time will identify commonalities and overlapping indicators. Figure 5 illustrates how highly-dimensional correlation between two intrusions through multiple kill chain phases can be identified. Through this process, defenders will recognize and define intrusion campaigns, linking together perhaps years of activity from a particular persistent threat. The most consistent indicators, the campaign's key indicators, provide centers of gravity for defenders to prioritize development and use of courses of action. Figure 6 shows how intrusions may have varying degrees of correlation, but the inflection points where indicators most frequently align identify these key indicators. These less volatile indicators can be expected to remain consistent, predicting the characteristics of future intrusions with greater confidence the more frequently they are observed. In this way, an adversary's persistence becomes a liability which the defender can leverage to strengthen its posture.
The principal goal of campaign analysis is to determine the patterns and behaviors of the intruders, their tactics, techniques, and procedures (TTP), to detect "how" they operate rather than specifically "what" they do. The defender's objective is less to positively attribute the identity of the intruders than to evaluate their capabilities, doctrine, objectives and limitations; intruder attribution, however, may well be a side product of this level of analysis. As defenders study new intrusion activity, they will either link it to existing campaigns or perhaps identify a brand new set of behaviors of a theretofore unknown threat and track it as a new campaign. Defenders can assess their relative defensive posture on a campaign-by-campaign basis, and based on the assessed risk of each, develop strategic courses of action to cover any gaps.
Another core objective of campaign analysis is to understand the intruders' intent. To the extent that defenders can determine technologies or individuals of interest, they can begin to understand the adversary's mission objectives. This necessitates trending intrusions over time to evaluate targeting patterns and closely examining any data exfiltrated by the intruders. Once again this analysis results
Ref: Hutchins et al., "Intelligence-Driven Computer Network Defense Informed by Analysis of Adversary Campaigns and Intrusion Kill Chains"
Current Security Analysis
"When you ask an engineer to make your boat go faster, you get the trade-space. You can get a bigger engine but give up some space in the bunk next to the engine room. You can change the hull shape, but that will affect your draw. You can give up some weight, but that will affect your stability. When you ask an engineer to make your system more secure, they pull out a pad and pencil and start making lists of bolt-on technology, then they tell you how much it is going to cost."
Prof Barry Horowitz, UVA
Ref: (Hamilton et al., 2010)
STPA-Sec Process
System Engineering Foundations:
• Define and frame problem
• Identify losses/accidents
• Identify system hazards/constraints
Identify Types of Unsafe/Unsecure Control:
• Model functional control structure
• Identify unsafe/unsecure control actions
Identify Causes of Unsafe/Unsecure Control and Eliminate or Control Them:
• Trace hazardous control actions using information lifecycle
• Identify scenarios leading to unsafe control actions
• Identify scenarios leading to unsecure control actions
• Place scenarios on D4 chart to ID more critical security scenarios
• Wargame security scenarios to select control strategy
• Develop new requirements, controls, and design features to eliminate or mitigate unsafe/unsecure scenarios
RED = STPA-Sec Extension on STPA
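The artifact chain the process above produces (losses → hazards and constraints → unsafe/unsecure control actions → traceable scenarios) can be sketched as data. This is an illustrative sketch only, not part of STPA-Sec itself; all class names, fields, and the ground-controller ABORT example are hypothetical.

```python
# Hypothetical data model for STPA-Sec artifacts, so each analysis product
# traces back to the one that produced it.
from dataclasses import dataclass


@dataclass
class Loss:
    ident: str          # e.g. "L1"
    description: str


@dataclass
class Hazard:
    ident: str          # e.g. "H1"
    description: str
    losses: list        # Loss objects this hazard can lead to
    constraint: str     # high-level constraint that controls the hazard


@dataclass
class UnsecureControlAction:
    controller: str
    action: str
    flaw: str           # which of the four hazardous-control types applies
    hazards: list       # Hazard objects the flawed action gives rise to


def trace(uca: UnsecureControlAction) -> list:
    """Trace a flawed control action back to the unacceptable losses it enables."""
    return sorted({loss.ident for h in uca.hazards for loss in h.losses})


# Minimal walk-through of the chain: frame a loss, derive a hazard, then a UCA.
l1 = Loss("L1", "Mission Failure")
h1 = Hazard("H1", "Fail to meet desired effect on target", [l1],
            "C1: Weapon must achieve desired effect on target")
uca = UnsecureControlAction("Ground Controller", "ABORT",
                            "command not issued when required", [h1])
print(trace(uca))   # ['L1']
```

The point of the structure is traceability: every scenario-level finding can answer "which unacceptable loss does this matter for?" without re-deriving the analysis.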
Key Concepts Behind STPA-Sec
• Abstraction Hierarchy
• Constraints
• Process Models
STPA-Sec and Abstraction Hierarchy
• Hierarchies provide a framework
• Abstraction is effective in dealing with complexity
• Abstraction hierarchy is a meaningful way to visualize the difference between STPA-Sec and traditional security approaches
Use of an Abstraction Hierarchy Has Been Invaluable in Identifying What People Mean by "Security"
Coping with Complexity Through Abstraction
[Figure: abstraction hierarchy – Purpose, Abstract Function, General Function, Physical Process, Physical Form – illustrated with a transportation system example (stocks and flows, traffic patterns, network rules, car).]
Information Hiding Through Aggregation
[Figure: aggregation – Whole System decomposed into the Hardware, Software, and Human elements of Subsystem 1.]
[Figure: two dimensions of abstraction – a Whole-Part axis (Whole System → Subsystem 1, Subsystem 2 → Component → HW/SW/Human) and a conceptual-to-physical Ends-Means axis (Functional Purpose → Abstract Function → General Function → Physical Function → Physical Form).]
Undesired System Functionality
Transportation System or Weapon System?
• Aircraft must maintain minimum safe separation (loss: Mid-Air Collision)
• Only hostile forces must be engaged (loss: Friendly Fire Loss)
• PII must only be exposed to authorized entities (loss: Customer PII Theft)
[Figure: control loop – a Controller containing a Process Model and Control Algorithm issues Control Actions to a Controlled Process, whose components interact in direct and indirect ways, and receives Feedback.]
• Four types of hazardous control actions:
– Failing to issue a command gives rise to a hazard
– Issuing a command gives rise to a hazard
– Issuing a command too early or too late gives rise to a hazard
– Stopping a command too soon or applying it too long gives rise to a hazard
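The four types above work as a checklist applied to every control action in the structure. A minimal illustrative sketch, assuming a hypothetical `enumerate_ucas` helper and a generic controller/ABORT example (neither is from the slides):

```python
# The four hazardous control action types, used as a checklist generator.
UCA_TYPES = (
    "failing to issue the command gives rise to a hazard",
    "issuing the command gives rise to a hazard",
    "issuing the command too early or too late gives rise to a hazard",
    "stopping the command too soon or applying it too long gives rise to a hazard",
)


def enumerate_ucas(controller: str, action: str) -> list:
    """One candidate unsafe/unsecure control action per hazardous-control type."""
    return [f"{controller} / {action}: {t}" for t in UCA_TYPES]


# Each line becomes a prompt for the analyst: is this combination hazardous here?
for candidate in enumerate_ucas("Controller", "ABORT"):
    print(candidate)
```

The value of the checklist is completeness: every control action gets examined against all four flaw types, rather than only the ones an analyst happens to imagine.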
Unacceptable Losses Reframed as Control Problems
Each constraint becomes something a controller (with its Process Model and Control Algorithm) must enforce:
• Aircraft must maintain minimum safe separation – ENFORCE: Safe Separation
• Only hostile forces must be engaged – ENFORCE: Engagement Rules
• PII must only be exposed to authorized entities – ENFORCE: Data Access Policy
[Figure: the whole-part/ends-means abstraction space, now annotated with STPA-Sec artifacts – Mission Purpose & Goal, Unacceptable Losses, Hazards, Control Structure, and Scenarios – spanning Functional Purpose down to Physical Form.]
APPLICATION
High-level Example of STPA-Sec Application
• Based on a real-world proof-of-concept evaluation of STPA-Sec
• Given a Capabilities Development Document (CDD) for a real U.S. military weapon system
• Tasked to identify cyber vulnerabilities at the Concept stage
• System diagrams available
– CDD, Concept of Employment, and mission descriptions
– Dept of Defense Architectural Framework (DoDAF) Operational Views (OV) 1-6
– DoDAF System Views (SV) 2, 4, 5, 6
• Developed STPA-Sec-specific artifacts
commanders at all levels of warfare through networking and global connectivity.
Figure 3.1 Network-centric Operations
Conceptually, NCW provides battlespace entities with "shared battlespace awareness" through interconnectivity and networking techniques. Warfighters can use a common operational picture (COP) to self-synchronize and increase their situational awareness to reduce the fog and friction of war. The air-to-air mission area is the most thoroughly documented and convincing example of the power of NCW. When fighter aircraft are networked, digital information on friendly and hostile forces is shared instantaneously, thereby enabling participants to employ with enhanced awareness. For example, fighter pilots only need to look at their joint tactical information distribution system (JTIDS) display to get an entire assessment of the air battle including who is supporting missiles and who needs assistance. In contrast, non-networked fighters must share information via voice communications with other fighters and controllers. Building a mental picture through voice communication takes much more experience and leads to informational
Traditional Security Analysis Results of Same Weapon
• Weapon previously assessed using alternative methodology
• No specific recommendations for controls
• No specific scenarios generated to assist engineers
• Evaluated weapon within traditional IT framework
• Hypothesized network disruptions not clearly tied to the most relevant mission impacts
• Example recommended mitigations
– "Secure Datalinks"
– "Establish operational procedures to detect and counter compromises"
Example Problem Structuring
Unacceptable Losses
Unacceptable Losses for NEW:
• L1: Mission Failure
• L2: Violate Rules of Engagement
• L3: Loss of Weapon Critical Program Information
System Hazards
• H1: Fail to Meet Desired Effect on Target (Loss: L1) – C1: Weapon must achieve desired effect on target
• H2: Loss or Damage to Blue Asset (Losses: L1, L2) – C2: Blue assets must not be damaged or lost from stockpile to employment
• H3: Collateral Damage beyond Accepted Level (Loss: L2) – C3: Weapon must not cause collateral damage beyond accepted level
• H4: Weapon Critical Program Information Exposed (Loss: L3) – C4: Critical protected information must not be exposed to unauthorized persons
ABORT Functionality Helps Enforce C3
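The loss-hazard-constraint traceability in the table above can also be captured as data and machine-checked. The IDs and wording below come from the slides; the Python representation and the consistency check are an illustrative addition, not part of the original analysis.

```python
# Hazard table from the slides, keyed by hazard ID, with a traceability check.
HAZARDS = {
    "H1": {"text": "Fail to meet desired effect on target",
           "losses": ["L1"], "constraint": "C1"},
    "H2": {"text": "Loss or damage to blue asset",
           "losses": ["L1", "L2"], "constraint": "C2"},
    "H3": {"text": "Collateral damage beyond accepted level",
           "losses": ["L2"], "constraint": "C3"},
    "H4": {"text": "Weapon critical program information exposed",
           "losses": ["L3"], "constraint": "C4"},
}
LOSSES = {"L1": "Mission failure",
          "L2": "Violate rules of engagement",
          "L3": "Loss of weapon critical program information"}

# Every hazard must trace to at least one defined unacceptable loss.
for hid, h in HAZARDS.items():
    assert h["losses"] and all(l in LOSSES for l in h["losses"]), hid
print("all hazards trace to defined losses")
```

A check like this is trivial for four hazards, but the same pattern keeps a large analysis honest as hazards and constraints are added or revised.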
[Figure: functional control structure – FCS and the Weapon Boundary.]
Potential Scenario
• Target area was clear when the weapon was released. Once the weapon enters the target area, the ground controller notices that potential non-combatants have entered the target area. The ground controller needs to abort the weapon but lacks the proper crypto to send the ABORT command.
– Tradeoff: encryption allows secure transmission of the ABORT command, but an unencrypted ABORT signal provides greater confidence that the weapon can be aborted under a broader set of circumstances
No Right Answer… But at Least the Tradespace is Better Understood
ILLUMINATION
Examples of Benefits from Improved Problem Definition
• Removing a particular feature that cannot be reasonably secured
• Removing a feature that is not expected to provide benefit exceeding the cost to secure it
• Not upgrading to a new system with increased attack surface
STPA-Sec Integration with NIST Risk Management Framework (DoD Example)
Ref: DoDI 8510.01, March 12, 2014, Enclosure 6, Figure 3 (RMF for IS and PIT Systems):
a. Step 1 - Categorize System
(1) Categorize the system in accordance with Reference (e) and document the results in the security plan. Categorization of IS and PIT systems is a coordinated effort between the PM/SM, ISO, IO, mission owner(s), ISSM, AO, or their designated representatives. In the categorization process, the IO identifies the potential impact (low, moderate, or high) resulting from loss of confidentiality, integrity, and availability if a security breach occurs. For acquisition programs, this categorization will be documented as a required capability in the initial capabilities document, the capability development document, the capabilities production document, and the cybersecurity strategy within the program protection plan (PPP). Specific guidance on determining the security category for information types and ISs is included in the KS.
(2) Describe the system (including system boundary) and document the description in the security plan.
(3) Register the system with the DoD Component Cybersecurity Program. See DoD Component implementing policy for detailed procedures for system registration.
(4) Assign qualified personnel to RMF roles. The members of the RMF Team are required to meet the suitability and fitness requirements established in DoD 5200.2-R (Reference (y)). RMF Team members must also meet appropriate qualification standards in accordance with Reference (p). RMF team member assignments must be documented in the security plan.
Security Analysis via STPA-Sec:
• Defines the mission, relevant losses, and hazardous system states to be controlled
• Defines the requirements to guide selection and deconfliction of NIST "Controls"
• Provides mission context to help engineer and implement the selected NIST "Controls"
• Provides a rubric to assess the actual vs. planned implementation of the NIST "Controls"
• Provides audit trail and rationale to support and enable senior leader analysis and decision making
• Provides list of leading indicators of system drifting into insecurity (to be monitored and assessed against proposed system changes)
Assessment Results
Before Training: Ability to Analyze Mission to Determine Impact of Particular Disruptions
• Somewhat Capable: 2
• Capable: 14
• Very Capable: 6
• Absolutely Capable: 2
After Training: Ability to Analyze Mission to Determine Impact of Particular Disruptions
• Somewhat Capable: 1
• Capable: 8
• Very Capable: 13
• Absolutely Capable: 3
Self-Reported Assessment Results
Before Training: Ability to Develop Mitigation Strategy
• Somewhat Capable: 4
• Capable: 14
• Very Capable: 4
• Absolutely Capable: 2
After Training: Ability to Develop Mitigation Strategy
• Somewhat Capable: 1
• Capable: 10
• Very Capable: 13
• Absolutely Capable: 1
Safety and Security
• Goal is loss prevention and risk management
– Source is probably irrelevant and may be unknowable
• Method is the development and engineering of controls
– Focus on what we have the ability to address, not the environment
• STPA/STPA-Sec provide opportunity for a unified and integrated effort through shared control structure!
Conclusion
• Must think carefully about defining the security problem
– Perfectly solving the wrong security problem doesn't really help
• STPA-Sec provides a means to clearly link security to the broader mission or business objectives
• STPA-Sec does not replace existing security engineering methods, but enhances their effectiveness
QUESTIONS??
Thank You for Your Time
• Wyoung@MIT.EDU
• For More Information, See my Dissertation: "Systems-Theoretic Security Engineering Analysis" (Available Jan 2017)
• STPA-Sec Practical Guide Coming Soon
LLNL Risk Assessment Study Results
- 3X as many risks identified with STPA as with traditional methods
- Wider range of risks
- "Outside the Box" risks
Blind Nuclear Power Industry Study
- STPA found all hazards uncovered by traditional approaches
- STPA uncovered additional hazards not found by older methods
- STPA identified one hazard scenario that led to an actual accident