Model Checking for Security

Rance Cleaveland, PhD
Professor of Computer Science, University of Maryland
Executive and Scientific Director, Fraunhofer CESE

Arnab Ray, PhD
Adjunct Associate Professor of Computer Science, University of Maryland
Senior Research Scientist, Fraunhofer CESE

6/10/2014
©2014 University of Maryland and Fraunhofer USA


A Common Problem

• You have developed a software system
• You want to ensure it is “secure”
• What do you do?
  – Think hard about possible attack scenarios
  – Try to use them to hack your system
  – If you are unsuccessful at hacking, then the system must be secure, right?


Wrong

• What if you missed a possible attack?
• Did you implement the attacks correctly?
• Did you correctly understand the possible attack results?


This Issue Is Not Just Security

• The General Correctness Problem
  – Given a system and a specification …
  – Does the system meet the specification?
• A confounding problem in computing!
  – State of the art: testing
    • Devise tests based on the spec and the system
    • Run the tests
    • Analyze the results
  – But
    • Are the tests extensive enough?
    • Were the results understood correctly?
• Consequence: buggy (insecure) systems


Formal Methods

• A different approach to correctness
  – The correctness problem is the same: given a system and a spec, determine whether the system meets the spec
  – However
    • The system and the spec must be mathematically precise
    • Establishing correctness = proof
• Proof? A mathematically complete argument that the system meets the specification


Proofs Are Hard

• Yes!
  – They also do not “scale”
  – They require deep expertise
  – …
• Maybe they can be automated?
  – Automated theorem provers aim for this
  – Problem: undecidability; it is impossible to prove everything automatically


Model Checking

• Not all theorems can be proved automatically …
• … but maybe some can?
• Model checking
  – Automated proofs of correctness
  – Originally: finite-state systems


This Tutorial

• Applying model checking to verify security properties
• Structure
  – Traditional model checking
  – Model checking and cryptographic protocols
  – Model checking and source-code security


INTRODUCTION TO MODEL CHECKING


Correctness Problems and Model Checking

• Model checking automates proof construction for correctness problems
• Correctness problems need systems and specs
• What are systems and specs for model checking?
  – Systems: finite-state Kripke structures
  – Specs: temporal logic


Kripke Structures

Atomic propositions encode “basic properties” of each state

[Diagram: a three-state Kripke structure. The initial state is labeled with the atomic propositions {idle1, idle2}; the other two states are labeled {exec1, idle2} and {idle1, exec2}. Arrows between states are transitions.]


Kripke Structures?

• Come from system descriptions
  – Models
  – Code
• In the case of code, states correspond to assignments of values to variables
• Atomic propositions are application-specific


Example: Dining Philosophers

[Diagram: five philosophers — Immanuel Kant, Martin Heidegger, David Hume, G.W.F. Hegel, and Ludwig Wittgenstein — seated around a table.]


Kripke Structures and Dining Philosophers: States

• What are the states?
  – Philosophers can be “eating” or “thinking” (aka “not eating”)
  – (Chop)sticks can either be “available” or “not available”
  – A philosopher can either have or not have the left stick, right stick
• Can model this using 4 arrays, each with 5 bits!
  – P[i] = 1 (true) iff philosopher i is eating (Kant = 0, Heidegger = 1, etc.)
  – S[i] = 1 iff stick i is available (stick i is to the right of philosopher i)
  – HR[i] = 1 iff philosopher i has right stick
  – HL[i] = 1 iff philosopher i has left stick
• Total # of potential states = 2⁵ · 2⁵ · 2⁵ · 2⁵ = 2²⁰ = 1,048,576


Dining Philosophers: States and Initial State

• Constraints
  – A philosopher can be “eating” if and only if s/he has both sticks (i.e. P[i] == 1 implies HR[i] == 1 && HL[i] == 1)
  – If a philosopher has a stick, that stick must be “not available” (i.e. HR[i] == 1 implies S[i] == 0, and HL[i] == 1 implies S[(i+1) mod 5] == 0)
  – Constraints reduce # of possible states
• Initial state
  – No philosophers eating; every stick available
  – So the initial state is

    P0 = [0,0,0,0,0]
    S0 = [1,1,1,1,1]
    HR0 = [0,0,0,0,0]
    HL0 = [0,0,0,0,0]


Atomic Propositions

• Domain / problem dependent!
• In what follows, what we care about is whether or not philosophers are eating
• So, atomic propositions have the form eatingi, which holds iff the ith philosopher is eating
• eatingi is true in a state iff P[i] == 1!


Dining Philosophers Transitions

• Defined using if … then rules (“actions”)
  – If the if part is true in a state …
  – … then a transition to the state defined by the then part exists
• E.g. “Pick up right chopstick”
  – if HR[i] == 0 && HL[i] == 0 && S[i] == 1
    then HR[i], S[i] := 1, 0
  – This rule determines when philosopher i can pick up the right stick: s/he can’t have either stick, and the right stick must be available
  – The new state is the same as the old, except that HR[i] is changed to 1 and S[i] to 0


Example Transition

• Rule
  if HR[i] == 0 && HL[i] == 0 && S[i] == 1
  then HR[i], S[i] := 1, 0
• In state Q, consider the i = 2 case
  – The condition of the rule is satisfied: HR[2] == 0 && HL[2] == 0 && S[2] == 1
  – The rule indicates there is thus a transition to Q’, where S’[2] == 0 and HR’[2] == 1

  Q:                      Q’:
  P:  [1,0,0,0,0]         P’:  [1,0,0,0,0]
  S:  [0,0,1,1,1]         S’:  [0,0,0,1,1]
  HL: [1,0,0,0,0]         HL’: [1,0,0,0,0]
  HR: [1,0,0,0,0]         HR’: [1,0,1,0,0]


A Complete Rule Set

• “Pick up right stick”
  if HR[i] == 0 && HL[i] == 0 && S[i] == 1
  then HR[i], S[i] := 1, 0
• “Pick up left stick, eat”
  if HR[i] == 1 && HL[i] == 0 && S[(i+1) mod 5] == 1
  then HL[i], S[(i+1) mod 5], P[i] := 1, 0, 1
• “Stop eating”
  if P[i] == 1
  then P[i] := 0
• “Put down right stick”
  if P[i] == 0 && HR[i] == 1 && HL[i] == 1
  then HR[i], S[i] := 0, 1
• “Put down left stick”
  if P[i] == 0 && HR[i] == 0 && HL[i] == 1
  then HL[i], S[(i+1) mod 5] := 0, 1

Notes
• Other rule sets are also possible!
• This rule set enforces the following sequence of actions on philosopher i
  – Pick up right stick
  – Pick up left stick and eat
  – Stop eating
  – Put down right stick
  – Put down left stick


A Kripke Structure for Dining Philosophers

• The entire Dining Philosophers system consists of states, rules, atomic propositions
• A Kripke structure can be generated automatically from these
  – List all possible states
  – For each state, add all transitions defined by the rules
  – Annotate each state with its atomic propositions


Temporal Logic

• Notation for writing desired properties of Kripke structures
• Starting point: propositional logic
  – Atomic propositions (e.g. eatingi)
  – &&, ||, ! (not), ⇒ (implies), etc.
  – Using propositional logic you can write properties of states, e.g. eating0 && !eating3
• Additions: operators for talking about time
  – F: “eventually”
  – G: “always”
• Note
  – There are several variants of temporal logic
  – For concreteness, we are using “linear-time” temporal logic


Using Temporal Logic

• G (idle1 || idle2)
  “It is always the case that process 1 or process 2 is idling”
• G (idle1 ⇒ F exec1)
  “It is always the case that if process 1 is idling, it will eventually execute”
• G (F eating0)
  “It is always the case that eventually philosopher 0 will be eating”
• G F (eating0 || ··· || eating4)
  “It is always the case that eventually some philosopher will be eating”


When Does a Kripke Structure Satisfy a TL Formula?

Intuitively: when every path starting from the initial state makes the formula true


When Do Formulas Hold?

• G (idle1 || idle2) yes

• G (idle1 ⇒ F exec1) no

[Diagram: the three-state Kripke structure from the “Kripke Structures” slide, with initial state {idle1, idle2} and states {exec1, idle2} and {idle1, exec2}.]


When Do Formulas Hold (cont.)

• Consider properties for the philosophers
  – G F eating0
  – G F (eating0 || ··· || eating4)
• Do they hold?
  – Depends on the transitions defined by the rules!
  – In our formulation, neither holds!
    • All philosophers pick up the right stick
    • No one can ever eat
  – Can make the second hold by e.g. having even philosophers pick up the right stick first while odd ones pick up the left stick first


Decidability

• For any finite-state Kripke structure …
• … and any TL formula …
• … one can fully automatically prove whether or not the structure satisfies the formula!
• Also:
  – If the formula is not satisfied …
  – … one can compute a counterexample (“error path”)


Discussion

• Sometimes automata rather than temporal formulas are used as specifications
• Variants of Kripke structures also appear
  – Transitions are sometimes labeled with events
  – States are sometimes not labeled with atomic propositions
• Tools
  – NuSMV
  – SPIN
  – Concurrency Workbench


MODEL CHECKING AND CRYPTOGRAPHIC PROTOCOLS


Model Checking and Security

• Model checking detects (automatically) whether a given finite-state system satisfies TL formulas
• How can this be used to check system security?
  – Formulate desired security properties in TL
  – Model the system and possible intruders
  – Check whether system + environment satisfies the formulas!
    • If so: secure (may need further hand proof)
    • If not: the counterexample represents a possible attack


Needham-Schroeder Protocol

• Use: establish an authenticated connection between A (initiator) and B (responder)
• Assumptions
  – Perfect public-key cryptographic infrastructure
  – Perfect “nonce” (random-number) generation
• Attackers can:
  – Forge addresses
  – Intercept messages
  – Generate spurious messages
• Attackers cannot:
  – Decode messages for which they do not have keys
  – Modify encrypted data


Needham-Schroeder (Simplified)

• NA, NB: nonces
• Message format: ⟨src, dest, payload⟩
• ⦃A, NA⦄B: encode payload (A, NA) using the public key of B

  A                                      B
  Generate NA
      ⟨A, B, ⦃A, NA⦄B⟩  →
                                         Generate NB
      ←  ⟨B, A, ⦃NA, NB⦄A⟩
      ⟨A, B, ⦃NB⦄B⟩  →


Goal of N-S Protocol

• At the end of the protocol, A and B have proven their identities to each other
• What would an attacker try to do?
  – An attacker would try to masquerade as A or B
  – An attack is successful if the attacker convinces A / B that it is B / A


Model Checking and N-S

• N-S has a vulnerability
• Model checking was used to uncover it
• The remainder of this section
  – Expressing the property that N-S should ensure
  – Modeling N-S so that attack possibilities can be exposed


Specifying N-S in Temporal Logic

• What is N-S intended to guarantee?
  – If A initiates with B and no errors occur …
  – … then A and B can trust the identity of the other
• In temporal logic, this can be expressed as follows
  – Atomic propositions
    • in(A,B): A initiates with B
    • err(A,B): error detected; no connection established
    • succ(A,B): protocol succeeds; authentication achieved
  – Then the following expresses correctness:
    G( in(A,B) ⇒ F (err(A,B) || succ(A,B)) )


Modeling N-S via Kripke Structures

• Two issues for modeling
  – How to get Kripke structures?
  – How to model attacker possibilities?
• Kripke structures
  – State consists of nonce values, message buffers for each agent, and the current stages of the protocol
  – Rules define how to manipulate buffers, construct messages, etc.


States

• For each protocol instance C in the protocol:
  – q[C] is the message queue for C
  – m[C] is the set of messages received so far by C
  – n[C] is the current nonce generated by C (or 0)
  – s[C] is the current state of the protocol
• Protocol states can be:
  – {i0:C, i1:C, i2:C, i3:C}: initiator states (parameterized by instance C)
  – {r0, r1:C, r2:C, r3:C, r4:C}: responder states (parameterized by instance C)


Example

• q[A] = [⟨B, A, ⦃NA, NB⦄A⟩]; q[B] = []

• m[A] = {}; m[B] = {⟨A, B, ⦃ A, NA⦄B⟩}

• n[A] = NA; n[B] = NB

• s[A] = i1:B; s[B] = r3:A

[Diagram: the N-S message sequence chart from the “Needham-Schroeder (Simplified)” slide; the state shown is the point just after B has sent ⟨B, A, ⦃NA, NB⦄A⟩.]


Transitions

• Also specified using actions: if … then …
• Example: “Receive from responder” (initiator C consumes B’s reply and sends the final message)

  if s[C] == i1:B && n[C] == NA && q[C] == ⟨B, A, ⦃NA, NB⦄A⟩ · q’
  then s[C], q[C], q[B], m[C] :=
       i2:B, q’, q[B] · ⟨A, B, ⦃NB⦄B⟩, m[C] ∪ {⟨B, A, ⦃NA, NB⦄A⟩}


Attack Modeling

• In cryptographic protocols, agents besides the “principals” can intrude
  – Messages can be intercepted, repurposed
  – Non-encrypted data can be forged, etc.
• How to model?
  – Idea: introduce a “maximally nondeterministic” intruder
    • The intruder maintains the set of protocol states that it is in
    • It can perform any permitted attacker operation (retrieve messages from any message queue, send any message, etc.)
    • This third party can interact with the “good” initiator and responder
  – Model checking can be run on the system with the intruder
  – If the property is violated, tools return a counterexample (execution path) representing an attack!


An N-S Attack Discovered via Model Checking

• When I initiates a session with B: in(I,B)
• When B (thinks it) successfully finishes: succ(A,B)
• So G( in(I,B) ⇒ F (err(I,B) || succ(I,B)) ) is violated

[Diagram: message sequence chart of the attack, among A, the intruder I, and B:]

  1. A → I: ⟨A, I, ⦃A, NA⦄I⟩      (A generates NA, initiates with I)
  2. I → B: ⟨A, B, ⦃A, NA⦄B⟩      (I re-encrypts the payload, posing as A)
  3. B → A: ⟨B, A, ⦃NA, NB⦄A⟩     (B generates NB; I relays it as ⟨I, A, ⦃NA, NB⦄A⟩)
  4. A → I: ⟨A, I, ⦃NB⦄I⟩         (A returns NB to I)
  5. I → B: ⟨A, B, ⦃NB⦄B⟩         (I completes B’s run, posing as A)


MODEL CHECKING AND SOURCE-CODE SECURITY


Model Checking Code

• Inputs:
  – Code written in a programming language (e.g. C)
  – A language for specifying properties (either logic or automata)
• Outputs:
  – A yes/no answer (typically “conservative”: a “yes” is a “yes”, but a “no” is a “maybe” [false positives])
  – A counterexample trace


Time of Check, Time of Use

The software checks the state of a resource before using that resource, but the resource's state can change between the check and the use in a way that invalidates the results of the check.

Common Weakness Enumeration (MITRE)


TOCTOU (TOCTTOU) Explained

  if (access("file", W_OK) != 0) {   /* access checks against real     */
      exit(1);                       /* user-id, not effective user-id */
  }
  fd = open("file", O_WRONLY);
  write(fd, buffer, sizeof(buffer));

(Fragment of a setuid program)


The Attack

Victim:
    if (access("file", W_OK) != 0) {
        exit(1);
    }
                  Attacker (after the access check, before the open):
                      // now "file" points to the password database
                      symlink("/etc/passwd", "file");
Victim:
    fd = open("file", O_WRONLY);
    // actually writing over /etc/passwd
    write(fd, buffer, sizeof(buffer));


TOCTTOU Property


Chroot() Vulnerability

• chroot(dir) confines a program to the filesystem under dir
  – chroot(“var/jail”) effectively replaces all references to “/” (i.e. root) with “var/jail”
• Used to sandbox a program so that whatever damage it may be made to do remains confined within the jail


Chroot Vulnerability

• Program P’s current directory is /home/joe
• P calls chroot(“var/jail”)
• An attacker compromises P and makes it execute chdir(“../..”)
• The attacker will now be at the real root directory
• Problem: chroot creates a jail but does not move you into the jail
  – If you are outside the jail, you may stay outside!


Basic Solution

• chroot(dir); chdir(“/”);
  – Change directory into the jail after you have defined it


Chroot Property


Basic Principle

• Take the property V (e.g. absence of TOCTTOU) to be proven
• Use the property V and different abstraction techniques to abstract the program P under consideration
  – Abstract away variables and actions not relevant to V
  – Example abstraction technique: use the length of an array as an abstraction for the contents of the array
• Use a model checker to check whether P satisfies V
• If the answer is yes, we are done
• If the answer is no, examine the counterexample to check whether it is a valid trace


Static Analysis Is a Type of Model Checking

• Example static analysis problem: searching for buffer overflows
• Property: for all paths through the program and for every statement in the program, there are no out-of-bounds memory accesses


Weakness: False Positives

• Because of abstraction, one gets false positives
  – Basic reason: the abstractions introduce spurious paths into the model program P’ that are not in P
  – We must always be conservative, else we risk giving wrong answers
• False positives need to be winnowed out manually
  – For large, complex code bases this is time-consuming


Moving Forward

• Use advanced techniques like CEGAR (counterexample-guided abstraction refinement) to automatically winnow out a significant number of spurious counterexamples
• Come up with better (i.e. more precise) property-driven abstraction techniques that reduce false positives in the first place


Thanks!

Rance Cleaveland
[email protected]

Arnab Ray
[email protected]
