It’s just security.
Posted: 20-Dec-2015
Jack Whitsitt | [email protected] | http://twitter.com/sintixerr
I am NOT representing my employer in any way, shape, or form
I’m not a critical energy sector expert in particular. Why am I here, then?
Started writing this talk by answering the panel questions
◦ Got stuck on question 1
About Me
I have no idea what they are – and don’t really care
◦ This is where I got stuck!
But here’s what I’ve seen instead:
◦ Phishing
◦ USB drives
◦ Common development errors
◦ Change management screw-ups
◦ Lack of visibility
Energy uses COTS and GUI systems for control
◦ Why would bad guys burn something dedicated when they can use common stuff?
◦ Maybe a pertinent answer is a question: Why can they still use common stuff?
What are the main SCADA vulnerabilities for energy?
The oil and gas industry breaches – at Marathon Oil, ExxonMobil, and ConocoPhillips – occurred in 2008 and went undetected until the FBI alerted the companies later that year and in early 2009
“We’ve seen real, targeted attacks on our C-level [most senior] executives,” says one oil company official…
Penetrated their electronic defenses using a combination of fake e-mails and customized spyware programs
“Antivirus software misses more than 20 percent of the Trojans in my testing.”
“What I’m saying to you is that it’s not just the oil and gas industry that’s vulnerable to this kind of attack: It’s any industry that the Chinese decide they want to take a look at,” says an FBI source. “It’s like they’re just going down the street picking out what they want to have.”
Christian Science Monitor, January 2010
We are doing things over and over again that we know we shouldn’t
Examples:
◦ WEP device attached to a vendor network
◦ Previously unknown networks or connections to the internet – not in the architecture
◦ Password-less smart meters found in a search engine. Whoops.
◦ Lack of human awareness: “Let me click that link”
These aren’t even “cyber security”-specific failures
But they’re what the bad guys use
None should have happened: errors are made at a high, largely uncontrolled rate
◦ Everyone makes them
Root Vulnerability: Error Rate
Infinite Trust Chains and No Perimeters
Examples:
◦ HMI hardware out of the box: host file was already compromised
◦ Embedded web server vulnerability in HMI gear
◦ No responsibility or authority, made worse by support models
Problem Compounded: Trust Chains
Attack surface is increasing – at a MINIMUM because of increasing interconnections, even without new technology
Tactical response won’t help:
◦ Not fixing one vulnerability
◦ Not fixing ten vulnerabilities
◦ Not fixing a thousand SCADA vulnerabilities
Must slow the flow, reduce the error rate. Can’t keep up if we don’t:
◦ We don’t have the resources
◦ Already can’t: compromise at-will
Key will be language, communication, and awareness. Currently, we can’t even consistently discuss goals in terms of common safety, operational, and business priorities, much less derive strategic solutions
Unchanged Error Rate + Increasing Attack Surface = Strategic Loss
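The arithmetic behind this slide can be sketched in a few lines. The numbers below (a 2% per-asset error rate, 50% yearly surface growth) are invented for illustration, not measured figures; the point is only that a constant error rate multiplied by a growing surface means losses grow with the surface, regardless of individual fixes:

```python
# Illustrative only: the error rate and growth rate are assumptions.
# If the per-asset error rate stays constant while the attack surface
# grows, expected exploitable errors grow with the surface -- fixing
# individual vulnerabilities does not change the trend.

def expected_errors(error_rate: float, surface: int) -> float:
    """Expected number of exploitable errors across the surface."""
    return error_rate * surface

for year in range(5):
    surface = int(100 * (1.5 ** year))  # assumed 50%/year growth
    print(f"year {year}: surface={surface}, "
          f"expected errors={expected_errors(0.02, surface):.1f}")
```

Under these assumed numbers the expected error count more than quadruples in five years, which is the "strategic loss" equation in miniature.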
Architecture diagrams are never true. Ever.
◦ If you want to know where your vulnerabilities are, look for where your reality is different from your expectation
◦ This might not be a manually maintainable process; a possible subject for research
Cyber security efforts without solid change control and management are like asking an ancient Roman god for rain. It’s not science, it’s faith
Number one failure of cyber security
First Principles: What do you have?
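The "reality vs. expectation" check above can be sketched as a simple inventory diff. The hostnames here are invented for illustration; in practice "discovered" would come from passive network monitoring or scanning:

```python
# Minimal sketch: diff the documented architecture against what is
# actually observed on the network. Hostnames are hypothetical.

expected = {"hmi-01", "historian-01", "eng-ws-01", "plc-02"}
discovered = {"hmi-01", "historian-01", "eng-ws-01",
              "unknown-wep-ap", "vendor-laptop"}

undocumented = discovered - expected  # on the wire, not in the diagram
missing = expected - discovered       # in the diagram, never observed

print("Investigate (undocumented):", sorted(undocumented))
print("Verify (expected but absent):", sorted(missing))
```

Each item in either set is exactly a place where reality differs from expectation, i.e. where to start looking for vulnerabilities.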
Now that you know what you have… what exactly are you DOING?
◦ “Securing the infrastructure” is not good enough – it doesn’t mean anything
Need an “algebra of security” that:
◦ Allows consistent, comparable expressions of goals
◦ Assures line of sight between strategic risks FROM cyber systems and tactical risks TO cyber systems
Until then, we’re talking at each other, not to each other, and hoping to get lucky
First Principles: Define what you want
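One way such an "algebra" could be sketched in code (the classes, fields, and example values here are assumptions for illustration, not part of the talk): express each strategic risk FROM cyber systems as a record linked to the tactical risks TO cyber systems that could realize it, so the two levels stay comparable and traceable:

```python
# Hypothetical schema: strategic risks stated in business terms,
# each linked to the tactical technical risks that realize it.

from dataclasses import dataclass, field

@dataclass
class TacticalRisk:
    asset: str     # e.g. a control-room HMI
    weakness: str  # e.g. a phishing-delivered implant

@dataclass
class StrategicRisk:
    business_impact: str  # stated in operational/business terms
    realized_by: list = field(default_factory=list)

outage = StrategicRisk(
    business_impact="loss of generation dispatch capability",
    realized_by=[TacticalRisk("control-room HMI",
                              "phishing-delivered remote access tool")],
)

# Line of sight: every tactical finding traces up to a business impact.
for t in outage.realized_by:
    print(f"{t.asset}: {t.weakness} -> {outage.business_impact}")
```

The value is not the data structure itself but the discipline: a tactical finding with no path to a `business_impact` is noise, and a strategic goal with no `realized_by` entries is unverifiable.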
Use the algebra to create energy-specific definitions of success
◦ What do we mean by secure energy infrastructure?
Techies can’t answer this for you
Create a definition that can be consistently understood across all players
Separate out priority valuation of goals from commonly understood goals
◦ If you can’t answer that question, how can you talk about how to build it?
◦ If you can’t answer that question and compare it to what you have to find gaps, how do you know where to start?
First Principles: Define what you want
Based partially on the Sandia Incident Classification Model: http://www.cert.org/research/taxonomy_988667.pdf
Based partially on the SABSA Enterprise Security Architecture model
Uses business threat trees to:
◦ Define strategic cyber security requirements for long-term planning
◦ Identify tactical technical issues that impact long-term objectives
◦ Allow independent parties to use the same language to express cyber security, even with different priority levels
◦ Create a framework against which security service architecture can be validated
Example Framework
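A toy threat-tree structure in the spirit of the framework above (the node labels and schema are illustrative; the talk does not specify one). Interior nodes are business-level threats, leaves are tactical issues, and each party can attach its own priority weights to the same shared tree:

```python
# Hypothetical business threat tree: shared structure, per-party priorities.

from dataclasses import dataclass, field

@dataclass
class ThreatNode:
    label: str
    priority: int = 0  # each independent party assigns its own
    children: list = field(default_factory=list)

    def leaves(self):
        """Return the tactical leaf nodes under this business threat."""
        if not self.children:
            return [self]
        out = []
        for child in self.children:
            out.extend(child.leaves())
        return out

tree = ThreatNode("disrupt energy delivery", children=[
    ThreatNode("compromise operator workstation", children=[
        ThreatNode("phishing e-mail"),
        ThreatNode("infected USB drive"),
    ]),
    ThreatNode("abuse vendor support channel"),
])

print([leaf.label for leaf in tree.leaves()])
```

Because the tree is shared but the `priority` field is not, two organizations can express the same cyber security picture while valuing its branches differently, which is exactly the third bullet above.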
Cede the network
◦ At least in terms of using network-level controls as the first means of data/access/action control at the application layer
◦ Putting a box around it is not, and will never be, granular enough
◦ Can’t do it anyway – it’s really, really big. This is a last resort
◦ Next steps of research: small unit test cases from data/behavior transition from one step to the next
Focus on gracefully handling compromise
◦ If we assume we’ve lost already and defense might be too expensive, are there alternatives?
◦ We all live with bacteria inside of us – can the energy infrastructure?
Don’t throw good money after bad
◦ Antivirus, firewalls, IPSs, and patching have failed IT – don’t blindly invest in them
Last Advice: Play Like You’ve Lost
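The "cede the network" advice can be illustrated with a deliberately tiny sketch (roles and actions here are invented): instead of trusting anything that reaches the control LAN, authorize every application-layer action explicitly, so a compromised host on the "inside" gains nothing by location alone:

```python
# Hypothetical application-layer authorization: deny by default,
# network location is irrelevant. Roles/actions are illustrative.

ALLOWED = {
    ("operator", "read_telemetry"),
    ("operator", "acknowledge_alarm"),
    ("engineer", "read_telemetry"),
    ("engineer", "change_setpoint"),
}

def authorize(role: str, action: str) -> bool:
    """Grant nothing by default; being on the LAN confers no rights."""
    return (role, action) in ALLOWED

print(authorize("operator", "change_setpoint"))  # a perimeter alone would not stop this
print(authorize("engineer", "change_setpoint"))
```

This is the granularity a "box around the network" cannot give: even after perimeter compromise, each action still has to pass an explicit check.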