Security Testing Automation and Reporting
Adrien de Beaupré, Intru-Shun.ca Inc.
ISSA-Ottawa, 26 May 2011
Agenda
• Definitions • Methodology • Workflow • Reporting • Problems • Solutions
Intru-Shun.ca Inc. 26/05/2011
Definitions
• Vulnerability - flaw or weakness in a system that can be exploited.
• Security audit - assess the adequacy of controls and evaluate compliance.
• Vulnerability assessment - description and analysis of vulnerabilities in a system.
• Penetration testing - circumvent the security features of a system.
OSSTMM
• The Open Source Security Testing Methodology Manual 3.0, covering security testing, security analysis, operational security metrics, trust analysis, operational trust metrics, and the tactics required to define and build the best possible security over Physical, Data Network, Wireless, Telecommunications, and Human channels.
Testing
• Every test consists of a stimulus and response, and monitoring to verify the response, or lack thereof.
• Testing consists of modules. • Each module has an input and an output. • You must monitor closely for responses. • Testing must be appropriate to the target. • Testing is of limited value if nothing is fixed.
Test Automation
• Automation of testing tools is not difficult. • Some tools have their own scheduling and automation features.
• Others can use built-in OS features such as cron and scripting languages.
• The more interesting part is ensuring compliance with a methodology and automating parts of analysis and reporting.
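A minimal sketch of what such a wrapper could look like in Python. The tool, flags, and output directory are placeholders chosen for illustration; the point is that each scheduled run (e.g. from cron) produces a timestamped, parseable result file.

```python
import datetime
import shlex

def build_scan_command(targets, out_dir="/var/scans"):
    """Build a repeatable, timestamped scan invocation suitable for
    scheduling from cron, e.g.:
        0 2 * * 0  /usr/bin/python /opt/scans/weekly_scan.py
    One comparable XML result file is produced per run."""
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    out_file = "%s/recon-%s.xml" % (out_dir, stamp)
    # nmap ping sweep writing XML output; swap in any tool with parseable output
    cmd = ["nmap", "-sn", "-oX", out_file] + list(targets)
    return cmd, out_file

cmd, out_file = build_scan_command(["192.0.2.0/24"])
print(" ".join(shlex.quote(c) for c in cmd))
```

The command is returned as a list (rather than run directly) so the same builder can feed a scheduler, a dry-run log, or `subprocess.run`.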
Methodology
• Logistics and planning • Open Source Information Gathering • Reconnaissance • Identification • Research • Vulnerability Scanning • Validation • Reporting
Open Source Info
• Purpose: gathering information on the target organization, typically from the Internet.
• Inputs: organization name, URL, IP addresses or ranges, industry or organization type.
• Outputs: URLs, IP addresses or ranges, email addresses, 'buzz', technologies used, resumes, names, host names.
• Data types: free-form text, graphics, statistics.
Reconnaissance
• Purpose: determine which systems are live and map the network.
• Inputs: URLs, IP addresses or ranges. • Outputs: Whois, DNS, IP addresses or host names of systems that are likely to be live.
• Tools: Ping, Nmap, Ike-scan, Fierce Domain Scanner, traceroute, ICMP…
• Data types: text files, XML files.
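Since Nmap can emit XML (`-oX`), the "live systems" output of this module can be extracted programmatically. A sketch, using a trimmed-down sample of Nmap's ping-sweep XML (a real file carries many more attributes, but the host/status/address structure is the same):

```python
import xml.etree.ElementTree as ET

# Trimmed sample of `nmap -sn -oX` output for a ping sweep.
SAMPLE = """<nmaprun>
  <host><status state="up"/><address addr="192.0.2.10" addrtype="ipv4"/></host>
  <host><status state="down"/><address addr="192.0.2.11" addrtype="ipv4"/></host>
  <host><status state="up"/><address addr="192.0.2.12" addrtype="ipv4"/></host>
</nmaprun>"""

def live_hosts(xml_text):
    """Return the addresses of hosts nmap reported as up."""
    root = ET.fromstring(xml_text)
    return [host.find("address").get("addr")
            for host in root.findall("host")
            if host.find("status").get("state") == "up"]

print(live_hosts(SAMPLE))  # ['192.0.2.10', '192.0.2.12']
```

That list of live hosts is exactly the input the next module (identification) expects.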
Identification
• Purpose: enumerate the systems that are live, determine open ports, listening services, applications, operating systems, and versions.
• Inputs: systems known to be live. • Outputs: ports, services, OS, versions, patches. • Tools: Nmap, Amap, Ike-scan, Nessus… • Data types: text files, XML files.
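The same XML-parsing approach carries this module's output forward. A sketch against a trimmed sample of Nmap's service-detection XML (`-sV -oX`), extracting open ports with service, product, and version:

```python
import xml.etree.ElementTree as ET

# Trimmed sample of `nmap -sV -oX` output.
SAMPLE = """<nmaprun>
 <host>
  <status state="up"/><address addr="192.0.2.10" addrtype="ipv4"/>
  <ports>
   <port protocol="tcp" portid="22">
    <state state="open"/><service name="ssh" product="OpenSSH" version="5.3"/>
   </port>
   <port protocol="tcp" portid="80">
    <state state="open"/><service name="http" product="Apache httpd" version="2.2.15"/>
   </port>
   <port protocol="tcp" portid="25">
    <state state="closed"/><service name="smtp"/>
   </port>
  </ports>
 </host>
</nmaprun>"""

def open_services(xml_text):
    """(address, port, service, product, version) for every open port."""
    rows = []
    root = ET.fromstring(xml_text)
    for host in root.findall("host"):
        addr = host.find("address").get("addr")
        for port in host.findall("ports/port"):
            if port.find("state").get("state") != "open":
                continue  # skip closed/filtered ports
            svc = port.find("service")
            rows.append((addr, int(port.get("portid")), svc.get("name"),
                         svc.get("product"), svc.get("version")))
    return rows

for row in open_services(SAMPLE):
    print(row)
```

The tuples map naturally onto database rows, which is what makes the later correlation step possible.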
Research
• Purpose: list all potential vulnerabilities. • Inputs: technologies in use. • Outputs: list of potential vulnerabilities. • Tools: vulnerability databases, search engines…
• Data types: text files, XML files, databases.
Vulnerability Scanning
• Purpose: identify known or unknown vulnerabilities in the identified technologies.
• Inputs: IP addresses, ports, services, applica,ons.
• Outputs: listing of potential vulnerabilities. • Tools: scanners such as Nessus, NexPose, Retina, Acunetix, Burp, W3AF…
• Data types: text files, XML files, databases…
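Scanner output parses the same way. A sketch against a heavily trimmed `.nessus` (v2) sample; real files carry many more attributes and child elements per `ReportItem`, and the plugin names below are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Heavily trimmed .nessus v2 export; plugin names are illustrative.
SAMPLE = """<NessusClientData_v2>
 <Report name="example">
  <ReportHost name="192.0.2.10">
   <ReportItem port="443" protocol="tcp" severity="3"
               pluginID="12345" pluginName="Example SSL weakness"/>
   <ReportItem port="80" protocol="tcp" severity="0"
               pluginID="10107" pluginName="HTTP Server Type and Version"/>
  </ReportHost>
 </Report>
</NessusClientData_v2>"""

def findings(xml_text, min_severity=1):
    """Potential vulnerabilities at or above a severity threshold."""
    out = []
    root = ET.fromstring(xml_text)
    for host in root.iter("ReportHost"):
        for item in host.findall("ReportItem"):
            sev = int(item.get("severity"))
            if sev >= min_severity:
                out.append((host.get("name"), int(item.get("port")),
                            sev, item.get("pluginName")))
    return out

print(findings(SAMPLE))
```

Severity-0 informational items are filtered by default but remain available for correlation by lowering the threshold.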
Validation
• Purpose: assign a confidence value and validate potential vulnerabilities.
• Inputs: listing of all potential vulnerabilities. • Outputs: listing of validated vulnerabilities and confidence rating values.
• Tools: penetration testing (Metasploit, Core Impact, Canvas…), scripts, manual validation.
• Data types: text files, graphics, XML files, database entries, databases.
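One way to attach a confidence value is to derive it from how the finding was validated. The labels and numeric weights below are assumptions made up for this sketch, not part of any standard or of OSSAMS; the idea is simply that an exploited finding deserves more confidence than a lone scanner hit:

```python
# Assumed validation methods and weights -- illustrative only.
CONFIDENCE = {
    "exploited":         1.0,  # proven with a working exploit (e.g. Metasploit)
    "manually_verified": 0.9,  # confirmed by hand
    "version_match":     0.6,  # banner/version matches an advisory
    "scanner_only":      0.3,  # single unvalidated scanner hit
}

def validate(finding, method):
    """Return a copy of the finding annotated with its validation
    method and the corresponding confidence rating."""
    rated = dict(finding)
    rated["validation"] = method
    rated["confidence"] = CONFIDENCE[method]
    return rated

f = validate({"host": "192.0.2.10", "plugin": "Example SSL weakness"},
             "manually_verified")
print(f["confidence"])  # 0.9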
Reporting
• Purpose: assign risk and priority ratings to confirmed vulnerabilities.
• Inputs: list of validated vulnerabilities. • Outputs: analysis results. • Tools: human brain power. • Data types: text files, database entries, documents.
• Wordsmithing.
Why Automate?
• Laziness :-) • Consistent results over time. • Allows for scheduling and trending. • Streamlined and more efficient. • Engineering a process that can be run and maintained by an operational group.
• Allows the test team to concentrate on the areas that are not automated.
Requirements
• Process – follow a consistent, repeatable methodology • Scriptable – typically Linux CLI tools • Tool – output that can be parsed • Database – for correlation and reporting • Correlated – multiple sources of data • Analyzed – intelligent human analysis • Mitigation – how to respond, recommendations • Metrics – quantitative, measurable, trends • Severity – rating system
Workflow
• Methodology is broken down into modules. • Output from one is the input to the next. • Unfortunately, most tools do not follow the methodology flow precisely, or may not allow for data extraction between modules.
• This means we must either run each tool multiple times with different configurations, or use a different tool for each module.
Workflow
• Output from module > database import • Database queries > inputs to next module • Reporting module > ticketing • Tickets > vulnerability management and mitigation
• Close the loop back to the test team process • Re-test where necessary
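The module-to-database-to-module loop above can be sketched in a few lines. SQLite stands in for the real database here, and the table names are illustrative:

```python
import sqlite3

# In-memory database standing in for the assessment datastore.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE hosts (addr TEXT PRIMARY KEY, state TEXT)")
db.execute("""CREATE TABLE ports (addr TEXT, port INTEGER, service TEXT,
              FOREIGN KEY (addr) REFERENCES hosts(addr))""")

# 1. Import the reconnaissance module's output.
recon = [("192.0.2.10", "up"), ("192.0.2.11", "down"), ("192.0.2.12", "up")]
db.executemany("INSERT INTO hosts VALUES (?, ?)", recon)

# 2. Query the database for the next module's input: live hosts only.
targets = [r[0] for r in db.execute(
    "SELECT addr FROM hosts WHERE state = 'up' ORDER BY addr")]
print(targets)  # feed these to the identification module

# 3. Import that module's output in turn, closing the loop.
db.executemany("INSERT INTO ports VALUES (?, ?, ?)",
               [("192.0.2.10", 22, "ssh"), ("192.0.2.12", 80, "http")])
```

Because every module reads its inputs from queries rather than from tool-specific files, the methodology is enforced by the database rather than by any one tool.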
Problem
• Individual tools do not always follow a methodology and do not always allow for sufficiently granular control.
• No one tool can perform all modules. • Methodology requires use of multiple tools. • Each tool may have a different output format or use a proprietary database.
• Correlation and analysis can be time consuming.
What is Missing
• Security assessments collect a lot of data, but don't correlate the data.
• To properly identify risk and threats, correlation of collected data is necessary.
• Qualitative scales mapped to controls. • Quantitative scales to cost of control. • Current systems – extremely expensive.
Solutions
• Single unified and normalized database schema for all security assessment tools.
• Obviously requires that such a schema exist! • Requires a parser for each tool we use. • This allows us to create an abstraction layer between the tools and the common database, while still allowing us to enforce the methodology regardless of the tools used.
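A sketch of that abstraction layer: one parser per tool, all emitting rows for the same normalized table. The column names and parser inputs below are invented for illustration; they are not the actual OSSAMS schema:

```python
import sqlite3

# Illustrative unified findings table -- not the real OSSAMS schema.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE findings (
    host TEXT, port INTEGER, tool TEXT, severity INTEGER, title TEXT)""")

# One parser per tool, each normalizing to the same row shape...
def parse_nessus(raw):
    return [(r["host"], r["port"], "nessus", r["sev"], r["name"]) for r in raw]

def parse_nmap(raw):
    return [(r["host"], r["port"], "nmap", 0, r["service"]) for r in raw]

# ...all feeding one insert path, so correlation and reporting never
# need to know which tool produced a given finding.
for rows in (parse_nessus([{"host": "192.0.2.10", "port": 443, "sev": 3,
                            "name": "Example SSL weakness"}]),
             parse_nmap([{"host": "192.0.2.10", "port": 443,
                          "service": "https"}])):
    db.executemany("INSERT INTO findings VALUES (?, ?, ?, ?, ?)", rows)

for row in db.execute(
        "SELECT tool, title FROM findings WHERE host = '192.0.2.10'"):
    print(row)
```

Adding a new scanner then means writing one parser, not reworking the reporting pipeline.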
OSSAMS
• Open Source Security Assessment Management System, still in development.
• A framework for security assessors to correlate and analyze risk to systems.
• Streamlines the assessment reporting process. • Build on past assessments, trends, and stats. http://www.ossams.com/ Adrien de Beaupré, Cody Dumont, and Darryl Williams.
Database Design
• One of the key aspects of OSSAMS is the normalized database design.
• It is capable of having any number of tool outputs as an input.
• Currently modeled using MySQL on Linux with Python or Perl scripts to parse outputs.
• A front-end will be designed to move away from the CLI.
• It is flexible, extensible, and Open Source.
Conclusions
• The key is not running the scanners, but analysis, correlation, documentation, and problem solving.
• Organizations can automate security testing and reporting processes.
• The key to automation is results parsing and database utilization.
• These can be built using Free / Open Source Software tools and/or commercial offerings.
• Should be done with proper planning, tools, methodology, processes, and expertise.
QUESTIONS?
ADRIEN@INTRU-SHUN.CA
THANK YOU!