DDoS Experiments with Third Party Security Mechanisms

Carla Brodley, Sonia Fahmy, Cristina Nita-Rotaru, Catherine Rosenberg
Current Students: Roman Chertov, Yu-Chun Mao, Kevin Robbins
Undergraduate Student: Christopher Kanich
June 9th, 2004
Year 1 Objectives

- Understand the testing requirements of different types of detection and defense mechanisms: we focus on network-based third party mechanisms
- Design, integrate, and deploy a methodology for performing realistic and reproducible DDoS experiments:
  - Tools to configure traffic and attacks
  - Tools for automation of experiments, measurements, and effective visualization of results
  - Integration of multiple software components built by others
- Gain insight into the phenomenology of attacks, including their first-order and second-order effects and their impact on detection mechanisms
Year 1 Accomplishments

- Designed and implemented experimental tools (to be demoed):
  - Automated measurement tools, routing/security mechanism log processing tools, and graph plotting tools
  - Automated configuration of interactive and replayed background traffic, routing, attacks, and measurements
  - Scriptable event system to control and synchronize events at multiple nodes
- Installed and configured the following software: Quagga/Zebra, WebStone, ManHunt, Sentivist
- Performed experiments and obtained preliminary results
- Generated requirements for DETER to easily support the testing of third party products
Why Third Party Products?

1. No Insider Information: we do not control or understand the internals of the mechanisms, so we cannot customize tests to them.
2. Vendor Neutrality: we have no incentive to design experiments biased toward either success or failure.
3. Requirements for DETER: third party tools were not designed for DETER; therefore, we can uncover setup and implementation challenges for DETER.
4. User Perspective: understanding how effectively popular tools defend against attacks will benefit many user communities.

Selected mechanisms: Symantec ManHunt v3.0 and Network Flight Recorder (NFR) Sentivist.
Why ManHunt and Sentivist?

- Provide DDoS detection and response
- Use coordinated distributed detection sensors (we only test the single-sensor configuration for now)
- Available in a software-only form that runs on Red Hat Linux
  - In contrast, many commercial solutions are available only as hardware appliances (e.g., Mazu Networks Enforcer), and some require Microsoft Windows XP, which makes remote experimentation with them difficult on the current DETER testbed
- Obtained both ManHunt and Sentivist at no cost
- The two mechanisms serve as a proof of concept for:
  - Our experimental methodology and tools
  - Identifying DETER testbed requirements for testing third party commercial mechanisms
Symantec ManHunt Claims

- "Use protocol anomaly detection and traffic monitoring to detect DDoS attacks, including zero-day attacks."
- "Provide session termination, traceback capabilities using 'FlowChaser,' QoS filters, and handoff responses across domains for DDoS protection."
- "Provide the ability to coordinate distributed detection sensors."
- "Detection at up to 2 gigabits per second traffic."
- "Identifies unknown attacks via analysis engine."

Currently, we focus only on ManHunt's detection capabilities.

http://enterprisesecurity.symantec.com/products/products.cfm?ProductID=156&EID=0
Attacks Studied

- Tools like Stacheldraht, TFN, and Trinoo should first be sanitized to ensure that they will not attempt to contact daemons outside the testbed
- We experiment with a few recently published attacks:
  - Tunable randomization of source and destination addresses [A. Hussain, J. Heidemann, and C. Papadopoulos. A framework for classifying denial of service attacks. SIGCOMM 2003]
  - UDP constant/square-wave flooding [A. Kuzmanovic and E. W. Knightly. Low-rate targeted denial of service attacks. SIGCOMM 2003]
  - RST reflection (ACKs carrying the victim's spoofed source address are sent to reflectors, which answer the unsolicited ACKs with RSTs directed at the victim)
  - ICMP echo request reflection
  - ICMP echo flooding
  - SYN flooding with variable rates
Experimental Goals

1. Identify challenges associated with testing third party products on DETER
2. Identify the impact of different attack parameters on application-level and network-level metrics
3. Identify the impact of the traffic selected to train an anomaly detection mechanism on false alarms

How? Our experiments vary (see the sweep sketch below):
- The mix of attacks
- Attack parameters, e.g., on and off periods
- Background traffic during the training and testing phases
- Security mechanisms: ManHunt and Sentivist

Our current victim is an Apache web server and a subset of its clients.
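
To illustrate how such a sweep can be automated, here is a minimal sketch in Python; the parameter values and the run_experiment driver are hypothetical placeholders, not our actual tooling:

    import itertools
    import subprocess

    # Hypothetical parameter grid; these values are placeholders.
    attacks = ["udp_square_wave", "rst_reflection", "syn_flood"]
    burst_lengths_ms = [0, 20, 40, 60, 80]   # square-wave on-period l
    mechanisms = ["manhunt", "sentivist"]

    def run_experiment(attack, burst_ms, mechanism):
        """Placeholder for driving one testbed run; here it only echoes the configuration."""
        subprocess.run(["echo", f"run {attack} l={burst_ms}ms ids={mechanism}"], check=True)

    for attack, burst_ms, mech in itertools.product(attacks, burst_lengths_ms, mechanisms):
        run_experiment(attack, burst_ms, mech)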
Experimental Setup

- Topology: generated by GT-ITM [Calvert/Zegura, 1996] and adapted to DETER by observing:
  - Limit of 4 on router node degree
  - Cannot employ power-law or small-world topologies
  - Delays and bandwidths consume nodes
- Quagga/Zebra [http://www.quagga.net/]: introduces BGP routers that generate dynamic routing traffic
- WebStone [http://mindcraft.com/webstone]: creates interactive WWW traffic with 40 clients at 5 sites
  - File sizes: 500 B, 5 kB, 50 kB, 500 kB, 5 MB, with decreasing request frequency (see the sketch below)
- Replayed NZIX traffic from 2 hosts mapped to all hosts [http://pma.nlanr.net/Traces/long/nzix2.html]
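
A minimal sketch of how a client can draw request sizes so that larger files are requested less often; the weights below are illustrative assumptions, not WebStone's actual fileset distribution:

    import random

    # Illustrative sizes (bytes) and assumed weights, decreasing with size;
    # WebStone's real fileset distribution differs.
    sizes = [500, 5_000, 50_000, 500_000, 5_000_000]
    weights = [0.35, 0.30, 0.20, 0.10, 0.05]

    def next_request_size():
        """Sample the size of the next client request."""
        return random.choices(sizes, weights=weights, k=1)[0]

    print([next_request_size() for _ in range(10)])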
Topology
Square Wave Experiment

- Varies: square-wave attack burst length l
  - The number and location of attacker(s), the attack period T, and the rate R were also varied, but those results are not reported here
- Objectives:
  - Understand attack effectiveness
  - Identify attack effects on routing
  - Identify attack effects on application-level and network-level metrics at multiple nodes
  - Identify when a mechanism starts identifying attacks

[Figure: square-wave attack pattern; traffic is sent at rate R for bursts of length l, separated by quiet periods of length T-l, repeating with period T]
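
For context when reading the results: a square wave with peak rate R, burst length l, and period T has duty cycle l/T and mean rate R*l/T, so short high-rate bursts keep the average rate low while still degrading TCP flows, which is the premise of the low-rate attack of Kuzmanovic and Knightly cited earlier.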
Impact on Throughput
Impact on Routing
2004/06/05 14:24:26 BGP: 10.1.39.3 [Error] bgp_read_packet error: Connection reset by peer
2004/06/05 14:24:43 BGP: 10.1.44.3 sending KEEPALIVE
2004/06/05 14:24:43 BGP: 10.1.44.3 KEEPALIVE rcvd
2004/06/05 14:25:43 BGP: 10.1.44.3 sending KEEPALIVE
2004/06/05 14:25:43 BGP: 10.1.44.3 KEEPALIVE rcvd
2004/06/05 14:25:50 BGP: 10.1.44.3 rcvd UPDATE w/ attr: nexthop 0.0.0.0
2004/06/05 14:25:50 BGP: 10.1.44.3 rcvd UPDATE about 10.0.254.0/24 -- withdrawn
2004/06/05 14:26:29 BGP: 10.1.44.3 rcvd UPDATE w/ attr: nexthop 10.1.24.2
2004/06/05 14:26:29 BGP: 10.1.44.3 rcvd 10.0.254.0/24
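
The excerpt shows the attack resetting a BGP session, followed by a route withdrawal and a later re-advertisement. A minimal sketch of the kind of log processing our tools automate, with regular expressions keyed to the Zebra log lines above (the event names are our own labels):

    import re
    import sys
    from collections import Counter

    # Patterns keyed to the Zebra/Quagga BGP log lines shown above.
    EVENTS = {
        "session_reset": re.compile(r"Connection reset by peer"),
        "withdrawal": re.compile(r"rcvd UPDATE about \S+ -- withdrawn"),
        "update": re.compile(r"rcvd UPDATE w/ attr"),
    }

    def summarize(lines):
        """Count routing events of interest in a BGP log stream."""
        counts = Counter()
        for line in lines:
            for name, pattern in EVENTS.items():
                if pattern.search(line):
                    counts[name] += 1
        return counts

    if __name__ == "__main__":
        print(summarize(sys.stdin))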
Aggregate Packet Statistics

Protocol mix (share of packets) under UDP square-wave attacks of varying burst length:

Scenario                     TCP    UDP   ICMP
UDP square wave (l=20 ms)    82%     9%     9%
UDP square wave (l=80 ms)    64%    18%    18%
UDP flood (T-l=0)            44%    28%    28%
No attack (l=0)             100%     0%     0%
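
Statistics like these can be computed directly from a packet capture. A minimal sketch using scapy (an assumption; any pcap-reading library would do) that counts the TCP/UDP/ICMP split:

    from scapy.all import rdpcap, TCP, UDP, ICMP  # assumes scapy is installed

    def protocol_mix(pcap_path):
        """Return the TCP/UDP/ICMP percentage split for a capture file."""
        counts = {"TCP": 0, "UDP": 0, "ICMP": 0}
        for pkt in rdpcap(pcap_path):
            for name, layer in (("TCP", TCP), ("UDP", UDP), ("ICMP", ICMP)):
                if pkt.haslayer(layer):
                    counts[name] += 1
                    break
        total = sum(counts.values()) or 1
        return {name: 100.0 * n / total for name, n in counts.items()}

    print(protocol_mix("experiment.pcap"))  # hypothetical capture file name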
Aggregate Application-level Metrics

[Four plots versus attack burst length (0-80 ms): average response time (0-0.16 s), total number of pages read (0-350,000), server throughput (0-90 Mbit/s), and average client throughput (0-2.5 Mbit/s)]
Demo

- RST reflection and tuned square-wave attack (60 ms-200 ms)
- Objectives:
  - Illustrate the ease of experimental setup with our tool on DETER
  - Identify attack effects on application-level and network-level metrics at multiple nodes
  - Identify attack effects on ManHunt
- Experiment timeline, in seconds (see the scheduler sketch below):
  - 0: quagga/zebra router setup
  - 220: host setup
  - 223/224: start WebStone and replay
  - 274: RST reflection begins
  - 474: RST reflection ends
  - 524: square wave begins
  - 674: square wave ends
  - 900: end of demo
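
A minimal sketch of a scriptable event timeline like the one above, using Python's standard sched module; the step strings are stand-ins for the actual setup and attack scripts (the 223/224 split between WebStone and replay is assumed):

    import sched
    import subprocess
    import time

    scheduler = sched.scheduler(time.time, time.sleep)

    def run(step):
        """Execute one timeline step (placeholder: just echo it)."""
        subprocess.run(["echo", step], check=True)

    # (offset seconds, step) taken from the demo timeline above.
    TIMELINE = [
        (0, "quagga/zebra router setup"),
        (220, "host setup"),
        (223, "start WebStone"),
        (224, "start replay"),
        (274, "start RST reflection"),
        (474, "stop RST reflection"),
        (524, "start square wave"),
        (674, "stop square wave"),
        (900, "end of demo"),
    ]

    start = time.time()
    for offset, step in TIMELINE:
        scheduler.enterabs(start + offset, 1, run, argument=(step,))
    scheduler.run()  # blocks for the full 900-second timeline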
Lessons Learned

- Insights into sensitivity to the emulation environment:
  - Some effects we observe may not be observed on actual routers, and vice versa (architecture and buffer sizes)
  - Emulab and DETER results differ significantly for the same test scenario (CPU speed)
  - Limits on the degree of router nodes, delays, and bandwidths
- Difficulties in testing third party products:
  - Products (hardware or software) connect to hubs, switches, or routers; layer 2/layer 3 emulation and automatic discovery/allocation would simplify using DETER to test third party mechanisms
  - Due to licenses, we need to control machine selection in DETER
  - Windows XP is required to test some products, e.g., the Sentivist administration interface
  - It is difficult to evaluate performance when a mechanism is a black box: e.g., we cannot mark attack traffic and must rely solely on our knowledge of the attack
Plans for Years 2 and 3

- Formulate a testing methodology for DETER:
  - Design increasingly high-fidelity experiments and better tools, to be made available to the DETER/EMIST teams
  - Identify simulation/emulation artifacts
  - Understand the impact of scale, including topology and the statistical properties of traffic
- Gain better insight into the phenomenology of attacks/defenses, including their second-order effects and how each is affected by experimental parameters
- Develop a taxonomy of the testable claims that security mechanisms make, and map each class of claims onto realistic experiments and metrics that validate those claims
Summary

- Identified challenges in testing third party mechanisms, providing feedback on requirements to the DETER testbed design team
- Understood how to design high-fidelity experiments (e.g., topology, dynamic routing, interactive traffic)
- Contributed to the collection of EMIST/DETER tools: experimental setup, attack mix, and measurement tools
- Demonstrated the power of the DETER testbed through a subset of representative experiments