Dartmouth Cyber-Security Initiative and the Achilles Vulnerability Assessment Console:
A Case Study in Collaboration
Dartmouth College
EDUCAUSE Security Professionals Conference
5/5/2008
Copyright Dartmouth CSI - 2008
Speakers
• David Bucciero – Dir. Technical Services, Peter Kiewit Computing Services, Dartmouth College
• Cory Cornelius – Research Engineer, Institute for Security Technology Studies, Dartmouth College
• Adam Goldstein – IT Security Engineer, Peter Kiewit Computing Services, Dartmouth College
• Chris Masone - PhD Candidate, Computer Science Department, Dartmouth College
• Scott Rea – Senior PKI Architect, Peter Kiewit Computing Services, Dartmouth College
Agenda
• Establishing Dartmouth CSI
• Phase 1 – Campus-wide Vulnerability Assessment
• Achilles: Vulnerability Analysis Console
• CSI involvement in operational projects
– Secure Wireless Architecture
– Incident Response
– SecureLinux
• Current Projects
• Achilles Demo
Collaborative Effort
All departments have a stake in IT security.
Therefore, multiple viewpoints should be included in the discussion and planning of a security program.
Dartmouth Cyber Security Initiative serves as an example
– Joint effort by faculty, staff and students
– Work together to plan, assess, and implement IT security solutions.
CSI Overview
Motivation
• Constant system and infrastructure improvement process
• Cost prohibitive for outside vendor to provide annual security assessment
• Partially in response to the 2004 Security Breach
CSI Overview - Continued
• Began in Summer of 2006
• Executive IT Security Committee formed
– PKCS staff and faculty from Computer Science and Thayer School of Engineering
• Bought into:
– Private addressing, so machines can’t be scanned from the Internet
– Intrusion Detection/Prevention
– Application authentication with eTokens and CAS-based web authentication
– Secure Document Server so we have a place to put sensitive data
• Determined that campus-wide assessment was next step in securing Dartmouth systems
First Steps
• Assembled team of student interns to work with a security consultant and PKCS staff to perform the assessment
• Legal counsel engaged to draft non-disclosure agreements and advise on strategy and potential exposures (recommended background check)
• Office of Risk Management engaged to advise on strategy and potential exposures
• Executive committee provides focus for assessment priorities and triage of issues encountered during process
• Security consultant begins the assessment, training students “on the job”: first on a concise, critical portion of the network, then assisting the students as they repeat the process on other portions, and finally generalizing the process to cover the entire network
Establishing Student Team
• Broad range of expertise tapped from mostly technically oriented programs:
– Computer Science (PhD, Masters, Undergrads)
– Engineering (PhD, Masters)
• Each potential participant must first agree to (and sign) a non-disclosure document, and submit the data necessary to have a background check completed
• Oversight is provided by Peter Kiewit Computing Services
• Weekly team meetings are used to assign tasks, report on progress, discuss issues, and collaborate on resolution tasks
• Mentoring is facilitated via participation of faculty and post doctoral researchers
– Initially entirely on a volunteer basis
Establishing Student Team (cont.)
• Recruitment of student participants is entirely word of mouth
• Natural selection process used to ensure sufficiently capable participants
• Some student representatives chosen to participate in Executive IT Security Committee
• Remuneration arrangements:
– Post graduates receive contributions/relief/support to their already established stipends (enabling more flexibility and capability in their educational experience)
– Undergraduates are engaged at an hourly rate on a part-time basis
– Everyone gains valuable education and hands-on, real-world experience working with a large, complex infrastructure
Participation from the Student Perspective
• The “Ivory Tower” vs. Real Life
• Use tools on REAL systems!
• Impact actual, deployed technology
• Research?
The “Ivory Tower” vs. Real Life
• Academics often ignore real-world constraints
– Exposure could be enlightening
• Often stuck with toy/canned systems
– Chance to gather data on a wide scale
– Tech no one thought to look at before?
• May have a better view of what’s coming
– New classes of attacks?
– Innovative, applicable new solutions?
Use tools on REAL systems
• Gain experience with tools
– Nessus, nmap, arpsk, vomit (later)
• Discover real, vulnerable systems
• BREAK IN!!!!!
– With permission, of course
• What CAN one do after 0wning something?
– Manual exploration, pivoting, impersonate the gateway, reconstruct print jobs, etc.
Impact actual, deployed tech
• We could help make Dartmouth better!
– Find holes, get them fixed, provide ammo
• We could have a voice in IT decisions!
– Get our needs heard, smooth transitions
• We could SEE what drives real-life choices!
– Factors other than “The Perfect Solution”?
Research?
• Not really for a PhD
– Excellent team-leading experience
• GREAT for undergrads, perhaps MS
– Huge bodies of data for measurement
– “Experiences” papers
• Developing lots of marketable skills!
– Got me an internship offer
– NSA, Cisco, etc.
Phase 1 of CSI: Campus-wide Assessment
• Perform a campus-wide vulnerability assessment that consisted of:
– Authentication system review and testing
– Wireless and wired network testing
– Network-based vulnerability scan of all hosts
– Wardialing of phone/fax/modem network
• Develop tools and procedures for performing future assessments
• Establish a remediation process for addressing identified vulnerabilities
• Provide both undergraduate and graduate students with valuable hands-on vulnerability assessment and penetration testing experience
Phase 1- Vulnerability Assessment Scanning
• Scanned all active hosts on Dartmouth’s network for known vulnerabilities – January 2007
• 233 Subnets and 8,771 hosts were assessed
• Student team developed a web-based console (Achilles) to analyze findings
• Scan tools and process documented for future assessments
• Automated report generation for help desk consultants and system administrators
Vulnerability Scanning Process
1. Host Discovery (NMAP)
2. Port Scanning and Service Identification (NMAP)
3. Automated Vulnerability Scanning (Nessus)
4. Analyze Scan Findings (Achilles – Developed by students)
5. Report Results
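The five-step process above can be sketched as a small orchestration script. This is a minimal sketch, not the team's actual tooling: the subnet, file names, and the classic Nessus command-line invocation are illustrative assumptions.

```python
# Sketch of the scanning pipeline: nmap for host discovery and service
# identification, then a batch Nessus scan against the discovered hosts.
# Flags and file names here are illustrative assumptions.

def discovery_cmd(subnet, out="alive.txt"):
    # Step 1: ping-sweep the subnet and record responsive hosts (greppable output)
    return f"nmap -sn -oG {out} {subnet}"

def portscan_cmd(targets="alive.txt", out="services.xml"):
    # Step 2: port scan with service/version identification, XML output
    return f"nmap -sV -iL {targets} -oX {out}"

def nessus_cmd(targets="alive.txt", out="results.nbe"):
    # Step 3: batch Nessus scan (classic CLI client syntax, placeholder credentials)
    return f"nessus -q localhost 1241 scanuser scanpass {targets} {out}"

# Steps 4-5 (analysis and reporting) happen in Achilles once results.nbe is loaded.
pipeline = [discovery_cmd("129.170.16.0/24"), portscan_cmd(), nessus_cmd()]
```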
Vulnerability Scan Analysis
• Developed Vulnerability Remediation Matrix to determine Dartmouth-specific severity
• Considered location, vulnerability type, and CVSS score (“Common Vulnerability Scoring System”, http://www.first.org/cvss/)
• Achilles database and web console used to organize data and make analysis more efficient
Phase 1 - Vulnerability Scan Results
• 600+ systems were identified with vulnerabilities requiring remediation
• Most common reasons included:
– Out-of-date or unpatched software
– Improper configurations
– Embedded devices with security issues
Achilles: Vulnerability Analysis Console
Why Achilles?
– Nessus output for 8,000+ hosts was too difficult to manage
– Existing reporting tools were aimed at small subnets or host-by-host analysis
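At that scale, loading the raw scanner output into a database is the first step. A minimal parser sketch, assuming the classic pipe-delimited `.nbe` export format of the era (record type, subnet, host, service, plugin ID, risk, description):

```python
# Minimal sketch of parsing a Nessus .nbe "results" record into a dict,
# assuming the classic seven-field pipe-delimited layout.

FIELDS = ["type", "subnet", "host", "service", "plugin_id", "risk", "description"]

def parse_nbe_line(line):
    fields = line.rstrip("\n").split("|")
    if not fields or fields[0] != "results":
        return None  # skip timestamp and other non-result records
    return dict(zip(FIELDS, fields))

rec = parse_nbe_line(
    "results|129.170.16|129.170.16.42|ssh (22/tcp)|10267|Security Note|SSH banner")
```

Records parsed this way can be bulk-inserted into the MySQL backend for sorting and ranking, rather than read slide-by-slide out of HTML reports.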
Achilles - Goals
• Standard interface for vulnerability reports
• Web-based console for analysts and system administrators to access reports
• Ability to update multiple records at once
• Sort and group findings by date, location, plugin, etc.
• Rank results based on previous analysis and Dartmouth-specific criteria
Achilles - Development
• Two person team initially, now maintained by one person
• Achilles was developed using an “Agile” style of development
• We immediately deployed Achilles and requested feedback
• Feedback was used to drive the development of Achilles
Achilles - Features
• Ruby on Rails backed by MySQL
• Achilles’ primary functionality is prioritizing vulnerabilities
• Prioritization is accomplished through a Ranking Matrix
• Also allows one to schedule and keep track of Nessus scans
• Everything is dynamic
Achilles – Ranking Matrix
• Vulnerabilities assigned a default severity on 1-5 scale
• Criteria for default severity ranking:
– Nessus Plugin Family
– Nessus Risk Factor and CVSS
– Vulnerability Type
– Network Location
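The criteria above combine into a default 1-5 score. A minimal sketch of such a ranking matrix, where the specific weights, category names, and thresholds are hypothetical rather than Dartmouth's actual matrix:

```python
# Hypothetical ranking-matrix sketch: each criterion nudges a base score,
# clamped to the 1-5 severity scale described above.

def default_severity(cvss, vuln_type, location):
    score = 1
    if cvss >= 7.0:
        score += 2          # high CVSS base score
    elif cvss >= 4.0:
        score += 1          # medium CVSS base score
    if vuln_type == "remote_exploit":
        score += 1          # remotely exploitable issues rank higher
    if location == "public":
        score += 1          # publicly addressable subnets rank higher
    return min(score, 5)    # clamp to the 1-5 scale
```

Analysts can then override the default per finding, which is where the "previous analysis" criterion feeds back in.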
Achilles – Operational Procedures
• IT, administrative, and critical academic subnets scanned monthly
• All subnets scanned quarterly
• IT Security Engineer takes first pass at findings and makes edits as needed
• System administrators and User Services consultants then check their systems
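The monthly/quarterly cadence above lends itself to a simple "due for scan" check. A sketch, with illustrative subnet tags and intervals:

```python
from datetime import date, timedelta

# Sketch of the scan cadence: priority (IT/administrative/critical academic)
# subnets monthly, everything else quarterly. Tags and dates are illustrative.

MONTHLY = timedelta(days=30)
QUARTERLY = timedelta(days=90)

def due_for_scan(subnet, today):
    interval = MONTHLY if subnet["priority"] else QUARTERLY
    return today - subnet["last_scanned"] >= interval

subnets = [
    {"cidr": "10.1.0.0/24", "priority": True,  "last_scanned": date(2008, 4, 1)},
    {"cidr": "10.2.0.0/24", "priority": False, "last_scanned": date(2008, 3, 1)},
]
due = [s["cidr"] for s in subnets if due_for_scan(s, date(2008, 5, 5))]
```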
Achilles – Next Steps
• Host history
• Search
• User preferences
• Auditing
• Reporting
Achilles – Where to get it
• Achilles 2.0 project on GitHub:
http://github.com/dxoigmn/achilles/
Phase 1 - Additional Testing
• Wardialing Tests
• Wireless Sniffing
• Man-in-the-Middle Tests
• Review of New Technologies
Phase 1 – Additional Tests
Interesting Findings:
• Wardialing Tests
– 1,600 numbers tested
– 2 vulnerable modems discovered
• Wireless Sniffing
– Lots of PII data flying by
– Passwords in the clear
Phase 1 - Additional Testing
Interesting findings (cont.):
• Man-in-the-Middle Tests
– Can we fool folks even if PKI is used to secure host services?
• Assumption that folks will click anything if they think they will end up where they want to go
• Little bit of social engineering employed
• Leverage open wireless network to trivially set up infrastructure
• Targeted locations
– Yes we can
• Justifications for an authenticated network
Phase 1 - Additional Testing
• Review of New Technologies
– Take a proactive approach to security by evaluating technology BEFORE it is deployed
– Develop a set of guidelines for evaluating technology against
– Educate system owners on the benefits of taking this approach
• Examples
– Set-top boxes
– Enterprise client software
– Wireless vending machines
CSI involvement in operational projects:
New Wireless Security Architecture
• CSI Team assisted in developing, testing, and deploying new wireless security architecture
• EAP-TLS/WPA2 selected for 802.1X authentication and encryption
– Most secure method
– Leverages Dartmouth’s PKI
– Will support 2-factor eTokens
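On the client side, an EAP-TLS network can be described with a wpa_supplicant stanza like the following sketch; the SSID, identity, and certificate paths are placeholders, not Dartmouth's actual configuration.

```
# Hypothetical wpa_supplicant.conf stanza for EAP-TLS over WPA2
network={
    ssid="Example-Secure"
    proto=RSN
    key_mgmt=WPA-EAP
    pairwise=CCMP
    eap=TLS
    identity="user@example.edu"
    ca_cert="/etc/certs/campus-root.pem"
    client_cert="/etc/certs/user.pem"
    private_key="/etc/certs/user.key"
}
```

With certificates issued from the campus PKI, the same stanza works whether the private key lives on disk or on a 2-factor eToken (via an engine/PKCS#11 configuration).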
New Wireless Security Architecture (cont.)
• CSI assisted in overcoming challenges of the new architecture
– Long-established “open” network
– Client configuration
– Legacy/headless devices
• Cooperative approach to cutover
• Fall 2007: in production; Feb 2008: required
• 60% adoption in first week with minimal support issues
Incident Response Team
• CSI hosted 2 incident response workshops
• Created incident response team
• Assisted in development of detailed “ground-up” procedures
• Wrote Communication Plan now being leveraged by other groups in IT
CSI – SecureLinux Project
• Stripped down Linux install– Minimal packages– Kickstart file
• Hardening Scripts– For new installs and existing systems– Undo available– Run on system, from web, or over ssh– Attempt to be distro-independent
CSI – SecureLinux Scripts
• Shut down listening services
• Firewall setup
• Configure ssh and sysctl
• Login banner
• Enable SELinux
• Set up Dartmouth root cert and proxy
• Patch system
• Suggestions?
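In the same spirit as the "undo available" design, a hardening script benefits from an audit step that checks settings before and after it runs. A minimal sketch for sshd_config, where the expected values are illustrative policy, not the CSI scripts' actual settings:

```python
# Hypothetical audit sketch: compare sshd_config against a small policy and
# report settings that are missing or differ. Expected values are illustrative.

EXPECTED = {
    "PermitRootLogin": "no",
    "Protocol": "2",
    "PasswordAuthentication": "no",
}

def audit_sshd(config_text):
    settings = {}
    for line in config_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip comments and blank lines
        key, _, value = line.partition(" ")
        settings[key] = value.strip()
    # Return only the settings that are absent or deviate from policy
    return {k: settings.get(k) for k, v in EXPECTED.items()
            if settings.get(k) != v}

issues = audit_sshd("Protocol 2\nPermitRootLogin yes\n# comment\n")
```

Running the audit first also gives the "undo" step a record of the pre-hardening values to restore.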
Current Projects
• Web Application Auditing:
– Focus on application security and data protection
– Security review of critical applications
– Identify and review application servers with critical data unknown to PKCS
– Develop process for testing new applications against OWASP “Top Ten”
• VoIP Security
• User awareness
– Kiosk
• Security metrics
Thanks!
Questions or comments:
Also:
Securing the eCampus 2008
Nov. 11-12, 2008
Dartmouth College, Hanover, NH
Copyright 2008 Trustees of Dartmouth College
Copyright Dartmouth CSI – 2008. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.