
Transcript of: Experience, Learning Processes, and Cyber Security

Page 1


Experience, Learning Processes, and Cyber Security

Cleotilde (Coty) Gonzalez & Noam Ben-Asher

Dynamic Decision Making Laboratory (www.cmu.edu/ddmlab)

Social and Decision Sciences Department, Carnegie Mellon University

Page 2

Research Objectives

– To establish a theoretical model of decision making in cyber-security situations that answers questions such as:
  • How do humans recognize and process possible threats?
  • How do humans recognize, process, and accumulate information to make “Attack”/“Defend” decisions?
  • How do human risk perception and risk tendencies influence “Attack”/“Defend” decisions?
– To provide a computational cognitive model of human decision-making processes in cyber-security situations that:
  • Addresses the challenges of cyber security while accounting for human cognitive limitations
  • Provides a better understanding of cyber attacks from the human information-processing perspective, in order to predict cyber attacks
  • Suggests approaches to investigate courses of action and the effectiveness of countermeasures

Page 3

DDMLab’s Research Approach

• Laboratory Experiments:
  – e.g., the “IDS security game”: studies the dynamic process of decisions from experience
  – the “Cyber Warfare game”: takes a broader view of cyber security and the relevant factors
• Cognitive Modeling:
  – Computational representations of the human experiential choice process in an Instance-Based Learning (IBL) model (Gonzalez et al., 2003); see the sketch after this slide
  – Scaling up IBL models to multi-agent environments, where each “agent” is represented with cognitive fidelity

The approach involves comparing data from computational cognitive models and from humans, both performing the same task.
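For context on the modeling approach, below is a minimal sketch of an IBL choice mechanism in the general spirit of Instance-Based Learning Theory: activation reflecting frequency and recency of use, noisy retrieval, and blended values over remembered outcomes. It is an illustrative simplification, not the DDMLab implementation; the parameter values (decay, noise, temperature) are assumptions.

```python
# Minimal sketch of an Instance-Based Learning (IBL) choice mechanism, in the
# spirit of Gonzalez et al. (2003). Illustrative only; parameter values
# (decay d, noise s, temperature tau) are assumptions.
import math
import random

class Instance:
    def __init__(self, situation, action, outcome, t_created):
        self.situation = situation     # attributes of the observed event
        self.action = action           # e.g., "Threat" or "No threat"
        self.outcome = outcome         # payoff experienced with this action
        self.timestamps = [t_created]  # creation time plus later retrievals

def activation(instance, t_now, d=0.5, s=0.25):
    """Base-level activation (frequency and recency of use) plus noise."""
    base = math.log(sum((t_now - t) ** (-d) for t in instance.timestamps))
    u = random.uniform(1e-6, 1 - 1e-6)
    return base + s * math.log((1 - u) / u)

def blended_value(memory, action, t_now, tau=0.35):
    """Average outcome for one action, weighted by retrieval probability."""
    relevant = [i for i in memory if i.action == action]
    acts = [activation(i, t_now) for i in relevant]
    weights = [math.exp(a / tau) for a in acts]
    total = sum(weights)
    return sum((w / total) * i.outcome for w, i in zip(weights, relevant))

# A model agent chooses the action with the highest blended value, observes
# the outcome, and stores it as a new instance: learning from experience.
```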

Page 4

Main projects (since last review)

Experimental work
1. Intrusion Detection in Cyber Security: A Comparison of Experts and Novices (Ben-Asher & Gonzalez, 2013, under review)
2. Detecting Cyber Threats: Effects of similarity and feedback on detection success (Ben-Asher & Gonzalez, in preparation)

Modeling work – set-aside funds (in collaboration with Nancy Cooke & Prashanth Rajivan)
3. Understanding Cyber Warfare: Scaling up IBL Models (Ben-Asher, Rajivan, Cooke, & Gonzalez, in preparation)

Page 5

1. Intrusion Detection in Cyber Security: A Comparison of Experts and Novices

• The high volume of intrusion alerts and the high probability of false alerts make the identification process very challenging for human cognitive capabilities.
• Questions:
  – How much and what kind of knowledge (theoretical and practical) is needed to be successful in detecting cyber attacks?
  – How do individuals differ in the detection of different types of cyber attacks?
  – Can we understand the “experience gap” between experts and novices?

Page 6

A simplified but realistic network (Lye and Wing, 2005)

• A router routes Internet traffic to and from the local network.
• A firewall prevents unwanted connections.
• The network has two subnetworks (one with a public web server, the other private).
• The web server runs an HTTP server and an FTP server for serving web pages and data.
• The web server is accessible by the public through the Internet.
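Purely as an orientation aid, the topology described above can be pictured as a small data structure; the node names, subnet assignments, and path below are assumptions drawn from this slide and the event descriptions later in the deck (file server, workstation), not an exact reproduction of the Lye & Wing (2005) network.

```python
# Illustrative sketch of the simplified network on this slide. Node names,
# subnet assignments, and the path are assumptions for exposition only.
nodes = {
    "web_server":  {"subnet": "public",  "services": ["httpd", "ftpd"]},
    "file_server": {"subnet": "private", "services": []},
    "workstation": {"subnet": "private", "services": []},
}

# Traffic from the Internet passes through the router and the firewall
# before reaching the public web server.
path_from_internet = ["internet", "router", "firewall", "web_server"]
```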

Page 7

Types of attack and network scenarios

Example of an attack: stealing information from a workstation
1. Normal operation
2. Attack the httpd service
3. Penetrate through the hacked httpd service
4. Install a sniffer to collect passwords
5. Hack into a workstation
6. Steal information
7. Create additional damage by sabotaging the network

Four different types of cyber attacks, representing different intentions of an attacker:
• Denial of Service (DoS)
• Steal information
• Install a sniffer
• Deface a website

Adapted from Lye & Wing (2005)

Page 8

The IDS Security Game

Page 9

Data collection

• 55 Novices (CMU students); 20 Experts (practitioners in network security)
• Novices received 10 network scenarios (9 representing a cyber attack and 1 that does not); Experts received 3 of the 10 scenarios (randomly selected)
• Each participant classified each network event as “Threat” or “No threat”, using our “IDS security game” (Novices) or the on-line tool (Experts), and decided whether there was an “attack” or not
• Score (and payment) was based on performance and the following payoff matrix
• Participants completed a 20-question survey that assessed practical and theoretical knowledge in information security

Page 10

Questionnaire: Theoretical and Practical experience in information security (examples)

• Theoretical experience
  – What does SSL stand for?
    • Systematic Security Level
    • Secret Service Logarithm
    • Secure Sockets Layer
    • Secret Secure Location
    • Standard Security Level
• Practical experience
  – How many years of working experience do you have in the network operation and security area?
    • 10+ years
    • 5-10 years
    • 1-5 years
    • A few months (less than a year)
    • None

Page 11

Working experience in network operation and security

Novices: None 93%; less than 1 year 7%
Experts: None 5%; less than 1 year 15%; 1-5 years 20%; 5-10 years 25%; 10+ years 35%

Page 12

Practical and Theoretical Knowledge in Information Security

Mean knowledge (0-1)    Novices    Experts
Theoretical             0.40       0.95
Practical               0.32       0.61

Page 13

Attack decision

Identification        Novices (d' = 1.52)    Experts (d' = 1.6)
Hit rate               0.68                   0.67
False alarm rate       0.14                   0.12
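For reference, the d' values reported here and on the next slide are consistent with the standard signal-detection relation d' = z(hit rate) - z(false-alarm rate). Below is a minimal check in Python (SciPy is used only for the inverse normal CDF); the small deviations from the reported values are what one would expect from the rates being rounded to two decimals.

```python
# Quick check of the reported sensitivity (d') values from the hit and
# false-alarm rates on this slide and the next (per-attack-type) slide.
from scipy.stats import norm

def d_prime(hit_rate, fa_rate):
    """Signal-detection sensitivity: d' = z(H) - z(FA)."""
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

print(d_prime(0.68, 0.14))  # Novices, attack decision  -> ~1.55 (reported 1.52)
print(d_prime(0.67, 0.12))  # Experts, attack decision  -> ~1.61 (reported 1.6)
print(d_prime(0.44, 0.18))  # Novices, per attack type  -> ~0.76 (reported 0.78)
print(d_prime(0.55, 0.15))  # Experts, per attack type  -> ~1.16 (reported 1.18)
```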

Page 14

Identification of Threats – Per Attack Type

Identification        Novices (d' = 0.78)    Experts (d' = 1.18)
Hit rate               0.44                   0.55
False alarm rate       0.18                   0.15

Page 15

What attributes do experts/novices use in making decisions?

• Simple ground truth:
  – Alert + Load + Operation
    • Alert: There is an increase in traffic between the web server and the file server.
    • Description: The web server is running ftpd and httpd services. The traffic is 10 Mbps between the Internet and the web server, 10 Mbps between the web server and the file server, and 3.3 Mbps between the workstation and the web server. An ftpd operation has been executed.
  – Alert + Operation
    • Alert: ftpd has stopped running on the web server.
    • Description: The web server is running httpd services. The traffic is 3.3 Mbps between the Internet and the web server, 3.3 Mbps between the web server and the file server, and 3.3 Mbps between the web server and the workstation. An ftpd operation has been executed.
  – Alert + Load
    • Alert: ftpd seems to be attacked.
    • Description: The web server is running ftpd and httpd services. The traffic is 6.7 Mbps between the Internet and the web server, 6.7 Mbps between the web server and the file server, and 3.3 Mbps between the web server and the workstation.
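To make the attribute structure above concrete, here is one illustrative way to encode an event by its three cue types. The field names and the cue_count helper are assumptions for exposition, not the encoding used in the study.

```python
# Illustrative encoding of a network event by the three cue types discussed
# above (Alert, Network Load, Operation). Field names and the helper are
# assumptions for exposition, not the study's actual implementation.
from dataclasses import dataclass

@dataclass
class NetworkEvent:
    alert: bool       # an alert is present in the event description
    high_load: bool   # traffic is elevated (e.g., 10 Mbps vs. 3.3 Mbps)
    operation: bool   # an operation (e.g., ftpd) has been executed

    def cue_count(self) -> int:
        """Number of possible threat indicators present in the event."""
        return sum([self.alert, self.high_load, self.operation])

# The three ground-truth combinations from this slide:
alert_load_operation = NetworkEvent(alert=True, high_load=True, operation=True)
alert_operation = NetworkEvent(alert=True, high_load=False, operation=True)
alert_load = NetworkEvent(alert=True, high_load=True, operation=False)
```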

Page 16

Correct Detection and Event’s Attributes

• Alert, Network Load, and Operation events were similarly detected by experts (74%) and novices (64%)

• Alert, Network Load events were detected slightly better by experts (57%) than novices (49%)

• Alert and Operation were more likely to be detected by experts (47%) than by novices (33%).


Page 17

False Alarms and Event’s Attributes

• When the description of the event included only one possible indicator for a threat (i.e., Alert or Network Load or Operation), we find no differences between experts and novices.

• However, compared to experts, novices were more likely to generate false alarms by classifying benign Network Load and Operation events as threats.

• Experts were aware that the operation can be the cause of the load and that load combined with operation can be a benign network activity that does not represent a threat.


Page 18

Main conclusions

• Experience is characterized by consistently accurate theoretical knowledge and high but variable practical knowledge.
• There was no difference in the accuracy of attack decisions between Experts and Novices, but identification of threats was better for Experts than for Novices in some scenarios.
• More cues make an event more diagnostic of a threat, but Experts are better at detecting threats, particularly with fewer cues.
• Experts were better at making connections between the attributes of a network event:
  – Both Experts and Novices were sensitive to the network load.
  – However, Experts were more aware of the relationship between operations and alerts (classified as threats).
  – Experts were more aware of the relationship between operations and load (not classified as threats).
  – Experts know WHY a threat is a threat, while Novices don’t.

Page 19

2. Detecting Cyber Threats: The effect of similarity and feedback on detection

• Questions:
  – What level of feedback speeds up learning?
  – What level of feedback helps in adapting to novel types of attack?
  – How similar do the types of attack need to be for training on one to be effective?
  – How influential is the experience acquired with one type of attack for performing well with a different type of attack?

Page 20

Two Similarity Levels (between training and transfer phases) and Two Feedback Levels (during training)

• Similarity (based on Lye and Wing, 2005):
  – High similarity: the same type of attack (e.g., stealing information), but the attacker penetrates the network by exploiting vulnerabilities of the web server in two different ways (httpd and ftpd).
  – Low similarity: different types of attack (e.g., stealing information and installing a sniffer).
• Feedback (based on Gonzalez, 2005):
  – Aggregated feedback: presents the total points earned by the classification.
  – Detailed feedback: indicates for each event whether it was a Hit, Miss, False Alarm, or Correct Rejection, using a color scheme (see the sketch after this list).
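A minimal sketch of the contrast between the two feedback levels, assuming a simple per-event scoring scheme: the outcome categories follow the slide, while the point values and function names are illustrative rather than the experiment's actual payoff matrix.

```python
# Illustrative contrast between the two feedback conditions. The outcome
# categories (Hit, Miss, False Alarm, Correct Rejection) come from the slide;
# the point values and function names are assumptions for exposition only.
def classify_outcome(decision: str, is_attack: bool) -> str:
    """Signal-detection outcome for a single 'Threat'/'No threat' decision."""
    if decision == "Threat":
        return "Hit" if is_attack else "False Alarm"
    return "Miss" if is_attack else "Correct Rejection"

# Hypothetical payoffs; the real payoff matrix appears in the study materials.
POINTS = {"Hit": 1, "Correct Rejection": 1, "Miss": -1, "False Alarm": -1}

def aggregated_feedback(decisions, ground_truth):
    """Aggregated condition: only the total score is shown."""
    outcomes = [classify_outcome(d, g) for d, g in zip(decisions, ground_truth)]
    return sum(POINTS[o] for o in outcomes)

def detailed_feedback(decisions, ground_truth):
    """Detailed condition: each event is labeled with its outcome
    (displayed with a color scheme in the actual game)."""
    return [classify_outcome(d, g) for d, g in zip(decisions, ground_truth)]
```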

Page 21

Aggregated Feedback

Only the TOTAL score is shown.

Page 22

Detailed Feedback

A breakdown of scores is shown: hits, misses, correct rejections, and false alarms. The events are colored accordingly.

Page 23

Data collection

• N = 160; 46% women; mean age = 23.04 years (SD = 2.90)
• A 2x2 experimental design (Feedback x Similarity)
• Each participant received:
  – 8 training trials of the same scenario with one of two types of feedback (Aggregated, Detailed)
  – 2 transfer trials of the same scenario without feedback
• Payment was based on performance and the following payoff matrices:
  – Training:
  – Transfer:

Page 24

Hits and FA during training

Training with detailed feedback improves Hits while keeping the FA rate constant.

Page 25

Detection performance during training based on Event’s Attributes

• Malicious events were detected more accurately during training with Detailed feedback


Page 26

Transfer detection performance: effects of training feedback and scenario similarity


Detailed feedback during training is key to successfully adapting to novel and familiar situations when no feedback is available.

Page 27

Detection performance at transfer based on event’s attributes


Page 28

Main conclusions

• Detailed feedback results in better detection of threats during training

• Detailed feedback during training is key to success at transfer, when feedback is absent

• Detailed feedback during training helped participants learn the diagnosticity of the attributes

• More cues are more diagnostic of a threat, and having received detailed feedback during training results in better detection of threats with fewer cues.

Page 29

2012-2013 Accomplishments (Statistics)

• 5 journal manuscripts during the current report period:
  – Risk tolerance and timing of adversarial threats: Modeling detection with Instance-Based Learning Theory (Dutt, Ahn, & Gonzalez, 2012)
  – From Individual Decisions from Experience to Behavioral Game Theory: Lessons for Cybersecurity (Gonzalez, 2012)
  – Detecting Cyber Threats: Comparing Experts and Novices (Ben-Asher & Gonzalez, under review)
  – Detecting Cyber Threats: Effects of similarity and feedback on detection success (Ben-Asher & Gonzalez, in preparation)
  – Learning and Risk Taking in Information Security: The effect of Intensity of Rare Failures (Ben-Asher & Gonzalez, in preparation)
• Students/post-docs supported:
  – 1 post-doc (part-time)
• Collaborative efforts:
  – Arizona State University (Nancy Cooke) – set-aside funds

Page 30

FY14 plan

• Complete the manuscripts currently in process
• Renewal of funding for collaboration:
  – Generate an experimental platform using the Cyberwar game
  – Data collection from human players and comparison to the model’s predictions in the Cyberwar game
  – Review parameters and assumptions of the IBL-CyberWar simulation
  – More Social Science Theory (e.g., transactive memory, coalition formation, wisdom of crowds)
  – Demonstrations of the dynamics of power and assets in a cyberwar scenario
• Collaborative publications (Ben-Asher, Rajivan, Cooke, & Gonzalez)