
CS 5950/6030 – Computer Security and Information Assurance

Section 3: Program Security

Dr. Leszek Lilien
Department of Computer Science, Western Michigan University

Slides based on Security in Computing, Third Edition, by Pfleeger and Pfleeger.
Using some slides courtesy of:
  Prof. Aaron Striegel – course taught at U. of Notre Dame
  Prof. Barbara Endicott-Popovsky and Prof. Deborah Frincke (U. Idaho) – taught at U. Washington
  Prof. Jussipekka Leiwo – taught at Vrije Universiteit (Free U.), Amsterdam, The Netherlands

Slides not created by the above authors are © 2006 by Leszek T. Lilien.
Requests to use original slides for non-profit purposes will be gladly granted upon a written request.


Program Security – Outline (1)

3.1. Secure Programs – Defining & Testing
  a. Introduction
  b. Judging S/w Security by Fixing Faults
  c. Judging S/w Security by Testing Pgm Behavior
  d. Judging S/w Security by Pgm Security Analysis
  e. Types of Pgm Flaws

3.2. Nonmalicious Program Errors
  a. Buffer overflows
  b. Incomplete mediation
  c. Time-of-check to time-of-use errors
  d. Combinations of nonmalicious program flaws


Program Security – Outline (2)

3.3. Malicious Code
  3.3.1. General-Purpose Malicious Code incl. Viruses
    a. Introduction
    b. Kinds of Malicious Code
    c. How Viruses Work
    d. Virus Signatures
    e. Preventing Virus Infections
    f. Seven Truths About Viruses
    g. Case Studies
    h. Virus Removal and System Recovery After Infection
  3.3.2. Targeted Malicious Code
    a. Trapdoors
    b. Salami attack
    c. Covert channels


Program Security – Outline (3)

3.4. Controls for Security
  a. Introduction
  b. Developmental controls for security
  c. Operating System controls for security
  d. Administrative controls for security
  e. Conclusions


3. Program Security (1)

Program security – our first step in applying security to computing

Protecting programs is the heart of computer security
  All kinds of programs: from applications through OS, DBMS, and network software

Issues:
  How to keep pgms free from flaws
  How to protect computing resources from pgms with flaws

Issues of trust not considered here:
  How trustworthy is a pgm you buy?
  How can you use it in its most secure way?

Partial answers:
  Third-party evaluations
  Liability and s/w warranties


Program Security (2)

Outline:
3.1. Secure Programs – Defining and Testing
3.2. Nonmalicious Program Errors
3.3. Malicious Code
  3.3.1. General-Purpose Malicious Code incl. Viruses
  3.3.2. Targeted Malicious Code
3.4. Controls Against Program Threats


3.1. Secure Programs - Defining & Testing

Outline:
a. Introduction
b. Judging S/w Security by Fixing Faults
c. Judging S/w Security by Testing Pgm Behavior
d. Judging S/w Security by Pgm Security Analysis
e. Types of Pgm Flaws

[cf. B. Endicott-Popovsky]


a. Introduction (1)

Pgm is secure if we trust that it provides/enforces:
  Confidentiality
  Integrity
  Availability

What is „program security”? Depends on whom you ask:
  user – fit for his task
  programmer – passes all „her” tests
  manager – conformance to all specs

Developmental criteria for program security include:
  Correctness of security & other requirements
  Correctness of implementation
  Correctness of testing


Introduction (2)

Fault tolerance terminology:
  Error – may lead to a fault
  Fault – cause of deviation from the intended function
  Failure – system malfunction caused by a fault

Note: [cf. A. Striegel]
  Faults – seen by „insiders” (e.g., programmers)
  Failures – seen by „outsiders” (e.g., independent testers, users)

Error/fault/failure example:
  Programmer's indexing error leads to a buffer overflow fault
  The buffer overflow fault causes a system crash (a failure)

Two categories of faults w.r.t. duration: [cf. A. Striegel]
  Permanent faults
  Transient faults – can be much more difficult to diagnose


b. Judging S/w Security by Fixing Faults

An approach to judging s/w security: penetrate and patch
  A Red Team / Tiger Team tries to crack the s/w
  If the s/w withstands the attack => security is good
  Is this true? Rarely.

Too often developers try to quick-fix problems discovered by the Tiger Team
Quick patches often introduce new faults, due to:
  Pressure – causing narrow focus on the fault, not its context
  Non-obvious side effects
  System performance requirements not allowing for security overhead

[cf. A. Striegel]


c. Judging S/w Security by Testing Pgm Behavior (1)

Better approach to judging s/w security: testing pgm behavior
  Compare behavior vs. requirements (think testing in s/w engineering)
  Program security flaw = inappropriate behavior caused by a pgm fault or failure
  A flaw is detected as a fault or a failure

Important: If a flaw is detected as a failure (an effect), look for the underlying fault (the cause)
  Recall: faults are seen by insiders, failures by outsiders
  If possible, detect faults before they become failures

Note: The textbook defines flaw–vulnerability–flaw in a circular way – a terminology soup!


Judging S/w Security by Testing Pgm Behavior (2)

Any kind of fault/failure can cause a security incident:
  Misunderstood requirements / errors in coding / typing errors
  In a single pgm / in the interaction of k pgms
  Intentional flaws or accidental (inadvertent) flaws

Therefore, we must consider the security consequences of all kinds of detected faults/failures
  Even inadvertent faults/failures
    Inadvertent faults are the biggest source of security vulnerabilities exploited by attackers
  Even dormant faults
    Eventually they can become failures harming users


Judging S/w Security by Testing Pgm Behavior (3)

Problems with pgm behavior testing:

Limitations of testing
  Can't test exhaustively
  Testing checks what the pgm should do
    Can't test what the pgm should not do
    i.e., can't make sure that the pgm does only what it should do – nothing more

Complexity – the malicious attacker's best friend
  Too complex to model / to test
  Exponential # of pgm states / data combinations
  A faulty line hiding in 10 million lines of code

Evolving technology
  New s/w technologies appear
  Security techniques keep catching up with s/w technologies

[cf. A. Striegel]


d. Judging S/w Security by Pgm Security Analysis

Best approach to judging s/w security: pgm security analysis
  Analyze what can go wrong
    At every stage of program development, from requirements definition to testing
    After deployment: configurations / policies / practices
  Protect against security flaws
    Specialized security methods and techniques
    Specialized security tools
    E.g., specialized security methods/techniques/tools for switching s/w

[cf. B. Endicott-Popovsky]


e. Types of Pgm Flaws

Taxonomy of pgm flaws:
  Intentional
    Malicious
    Nonmalicious
  Inadvertent
    Validation error (incomplete or inconsistent)
      e.g., incomplete or inconsistent input data
    Domain error
      e.g., using a variable value outside of its domain
    Serialization and aliasing
      serialization – e.g., in DBMSs or OSs
      aliasing – one variable, or some reference, when changed has an indirect (usually unexpected) effect on some other data
      Note: 'aliasing' not in the computer graphics sense!
    Inadequate ID and authentication (Section 4 – on OSs)
    Boundary condition violation
    Other exploitable logic errors

[cf. B. Endicott-Popovsky]


3.2. Nonmalicious Program Errors

Outline:
a. Buffer overflows
b. Incomplete mediation
c. Time-of-check to time-of-use errors
d. Combinations of nonmalicious program flaws


a. Buffer Overflows (1)

Buffer overflow flaw – often inadvertent (=> nonmalicious), but with serious security consequences

Many languages require buffer size declaration
  C language declaration: char sample[10];
  Executing the statement sample[i] = 'A'; where i = 10:
    out-of-bounds subscript (valid range: 0–9) – a buffer overflow occurs
  Some compilers don't check for exceeding bounds
    C does not perform array bounds checking
  A similar problem is caused by pointers
    No reasonable way to define limits for pointers

[cf. B. Endicott-Popovsky]
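A minimal C sketch of this flaw (the neighboring variable is illustrative – what the stray write actually hits depends on the compiler's stack layout, which is exactly why the behavior is undefined):

  #include <stdio.h>

  int main(void) {
      char overwritten = 'X';  /* a neighboring variable (layout-dependent) */
      char sample[10];
      int i = 10;

      sample[i] = 'A';         /* out of bounds: valid indices are 0..9 */

      /* Depending on the stack layout, 'A' may have landed in
         'overwritten', in padding, or in control data. */
      printf("neighbor = %c\n", overwritten);
      return 0;
  }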


Buffer Overflows (2)

Where does 'A' go? Depends on what is adjacent to sample[10]:
  Affects user's data – overwrites user's data
  Affects user's code – changes user's instruction
  Affects OS data – overwrites OS data
  Affects OS code – changes OS instruction

This is a case of aliasing (cf. the aliasing flaw under Types of Pgm Flaws)

[cf. B. Endicott-Popovsky]


Buffer Overflows (3)

Implications of buffer overflow:
  Attacker can insert malicious data values / instruction codes into the „overflow space”
  Suppose the buffer overflow affects an OS code area:
    Attacker's code is executed as if it were OS code
    Attacker might need to experiment to see what happens when he inserts A into the OS code area
    Can raise the attacker's privileges (to the OS privilege level)
      When A is an appropriate instruction
    Attacker can gain full control of the OS

[cf. B. Endicott-Popovsky]


Buffer Overflows (4)

Suppose the buffer overflow affects a call stack area. A scenario:
  Stack: [data][data][...]
  Pgm executes a subroutine
    => return address pushed onto the stack (so the subroutine knows where to return control when finished)
    Stack: [ret_addr][data][data][...]
  Subroutine allocates the buffer char sample[10]
    => buffer (10 empty spaces) pushed onto the stack
    Stack: [..........][ret_addr][data][data][...]
  Subroutine executes: sample[i] = 'A' for i = 10
    Stack: [..........][A][data][data][...]
    Note: ret_addr overwritten by A! (Assumed: size of ret_addr is 1 char)


Buffer Overflows (5)

Suppose the buffer overflow affects a call stack area – cont.
  Stack: [..........][A][data][data][...]
  Subroutine finishes
    Buffer for char sample[10] is deallocated
    Stack: [A][data][data][...]
  RET operation pops A from the stack (considers it the return address)
    Stack: [data][data][...]
  Pgm (which called the subroutine) jumps to A
    => shifts program control to wherever the attacker wanted

Note: By playing with his own pgm, the attacker can specify any „return address” for his subroutine
  Upon subroutine return, the pgm transfers control to the attacker's chosen address A (even in the OS area)
  The next instruction executed is the one at address A
    Could be the 1st instruction of a pgm that grants the highest access privileges to its „executor”
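The scenario above corresponds to the classic vulnerable C pattern sketched below (function names and sizes are illustrative, not from the slides):

  #include <string.h>

  /* Stack-smashing-prone pattern: a caller-supplied string is copied
     into a fixed-size stack buffer with no length check. If 'input'
     holds more than 9 characters plus the terminating NUL, the copy
     runs past 'sample' and can reach the saved return address. */
  void vulnerable(const char *input) {
      char sample[10];
      strcpy(sample, input);              /* no bounds check */
  }

  /* Safer variant: bound the copy to the buffer size. */
  void safer(const char *input) {
      char sample[10];
      strncpy(sample, input, sizeof sample - 1);
      sample[sizeof sample - 1] = '\0';   /* strncpy may not terminate */
  }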


Buffer Overflows (6)

Note: [Wikipedia – aliasing]

C programming language specifications do not specify how data is to be laid out in memory (incl. stack layout)

Some implementations of C may leave space between arrays and variables on the stack, for instance, to minimize possible aliasing effects.


Buffer Overflows (7)

A web server attack similar to a buffer overflow attack: pass a very long string to the web server (details: textbook, p. 103)

Buffer overflows are still common
  Used by attackers
    to crash systems
    to exploit systems by taking over control
  A large # of vulnerabilities are due to buffer overflows


b. Incomplete Mediation (1)

Incomplete mediation flaw – often inadvertent (=> nonmalicious), but with serious security consequences

Incomplete mediation: sensitive data are in an exposed, uncontrolled condition

Example:
  URL to be generated by the client's browser to access the server, e.g.:
    http://www.things.com/order/final&custID=101&part=555A&qy=20&price=10&ship=boat&shipcost=5&total=205
  Instead, the user edits the URL directly, changing the price and total cost as follows:
    http://www.things.com/order/final&custID=101&part=555A&qy=20&price=1&ship=boat&shipcost=5&total=25
  The user uses the forged URL to access the server
  The server takes 25 as the total cost


Incomplete Mediation (2)

Unchecked data are a serious vulnerability!

Possible solution: anticipate problems
  Don't let the client return a sensitive result (like total) that can easily be recomputed by the server (see the sketch after this list)
  Use drop-down boxes / choice lists for data input
    Prevent the user from editing input directly
  Check validity of data values received from the client
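A minimal server-side sketch of the first countermeasure, with hypothetical field names following the URL example above (in a real shop the unit price would also be looked up in the server's catalog rather than taken from the client):

  #include <stdio.h>

  /* The server recomputes the total from values it has checked,
     instead of trusting the client-supplied 'total' parameter. */
  int validate_order(int qty, int unit_price, int ship_cost, int client_total) {
      /* Basic range checks on every client-supplied value. */
      if (qty <= 0 || qty > 1000) return 0;
      if (unit_price <= 0 || ship_cost < 0) return 0;

      int server_total = qty * unit_price + ship_cost;  /* authoritative */
      if (client_total != server_total) {
          fprintf(stderr, "tampering suspected: client total %d != %d\n",
                  client_total, server_total);
          return 0;
      }
      return 1;   /* order accepted, with the recomputed total */
  }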


c. Time-of-check to Time-of-use Errors (1)

Time-of-check to time-of-use flaw – often inadvertent (=> nonmalicious), but with serious security consequences
  A.k.a. synchronization flaw / serialization flaw

TOCTTOU – mediation with „bait and switch” in the middle

Non-computing example:
  Swindler shows the buyer a real Rolex watch (bait)
  After the buyer pays, switches the real Rolex for a forged one

In computing: a change of a resource (e.g., data) between the time access is checked and the time access is used

Q: Any examples of TOCTTOU problems from computing?


Time-of-check to Time-of-use Errors (2)

... TOCTTOU – mediation with „bait and switch” in the middle ...

Q: Any examples of TOCTTOU problems from computing?
A: E.g., DBMS/OS serialization problem (a lost update):
  pgm1 reads the value of X = 10
  pgm1 adds 5: X = X + 5
  pgm2 reads X = 10, adds 3 to X, writes X = 13
  pgm1 writes X = 15
  X ends up with value 15 – should be X = 18
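Another classic computing instance, at the file system level – a C sketch: a privileged pgm checks a file with access() and then opens it with open(); in the window between the two calls an attacker can swap the file (e.g., for a symlink to a protected file), so the check no longer applies to the object actually used:

  #include <fcntl.h>
  #include <sys/stat.h>
  #include <unistd.h>

  /* TOCTTOU-prone: check and use are two separate steps. */
  int open_if_allowed(const char *path) {
      if (access(path, R_OK) != 0)        /* time of check */
          return -1;
      /* RACE WINDOW: the file named by 'path' can be replaced here. */
      return open(path, O_RDONLY);        /* time of use  */
  }

  /* Safer: open first, then check the descriptor itself, so the
     check and the use refer to the same file object. */
  int open_then_check(const char *path) {
      int fd = open(path, O_RDONLY | O_NOFOLLOW);
      if (fd < 0) return -1;
      struct stat st;
      if (fstat(fd, &st) != 0 || !S_ISREG(st.st_mode)) {
          close(fd);
          return -1;
      }
      return fd;
  }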


Time-of-check to Time-of-use Errors (3)

Prevention of TOCTTOU errors:
  Be aware of time lags
  Use digital signatures and certificates to „lock” data values after checking them
    So nobody can modify them after the check and before the use

Q: Any examples of preventing TOCTTOU from the DBMS/OS areas?


Time-of-check to Time-of-use Errors (4)

Prevention of TOCTTOU errors ...

Q: Any examples of preventing TOCTTOU from the DBMS/OS areas?

A1: E.g., DBMS: locking to enforce proper serialization (locks need not use signatures – they are fully controlled by the DBMS)
  In the previous example, locking:
    will force writing X = 15 by pgm1 before pgm2 reads X (so pgm2 adds 3 to 15), OR
    will force writing X = 13 by pgm2 before pgm1 reads X (so pgm1 adds 5 to 13)

A2: E.g., DBMS/OS: any other concurrency control mechanism enforcing serializability (a thread-level sketch follows)
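At the thread level, the same serialization can be enforced with a lock – a minimal sketch assuming POSIX threads, applied to the X = 10 example above:

  #include <pthread.h>

  int x = 10;
  pthread_mutex_t x_lock = PTHREAD_MUTEX_INITIALIZER;

  /* Holding the lock across the whole read-modify-write closes the
     check-to-use gap: no one can read X between our read and write. */
  void add_to_x(int delta) {
      pthread_mutex_lock(&x_lock);
      x = x + delta;
      pthread_mutex_unlock(&x_lock);
  }

  /* pgm1 calls add_to_x(5), pgm2 calls add_to_x(3):
     X always ends up 18, regardless of scheduling. */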


d. Combinations of Nonmal. Pgm Flaws

The above flaws can be exploited in multiple steps by a concerted attack

Nonmalicious flaws can be exploited to plant malicious flaws (next)


3.3. Malicious Code

Malicious code, or a rogue pgm, is written to exploit flaws in pgms
  Malicious code can do anything a pgm can
  Malicious code can change:
    data
    other programs

Malicious code was „officially” defined by Cohen in 1984, but virus behavior has been known since at least 1970
  Ware's study for the Defense Science Board (classified, made public in 1979)

Outline for this Subsection:
3.3.1. General-Purpose Malicious Code (incl. Viruses)
3.3.2. Targeted Malicious Code


3.3.1. General-Purpose Malicious Code (incl. Viruses)

Outline:
a. Introduction
b. Kinds of Malicious Code
c. How Viruses Work
d. Virus Signatures
e. Preventing Virus Infections
f. Seven Truths About Viruses
g. Case Studies
h. Virus Removal and System Recovery After Infection

[cf. B. Endicott-Popovsky]


a. Introduction

Viruses are a prominent example of general-purpose malicious code
  Not „targeted” against any particular user
  Attack anybody with a given app/system/config/...

Viruses
  Many kinds and varieties
  Benign or harmful
  Transferred even from trusted sources
    Also from „trusted” sources that are negligent about updating antiviral programs and checking for viruses

[cf. B. Endicott-Popovsky]


b. Kinds of Malicious Code (1) [remember the Introduction?]

  Trapdoors
  Trojan Horses
  Bacteria
  Logic Bombs
  Worms
  Viruses

[cf. Barbara Endicott-Popovsky and Deborah Frincke, CSSE592/492, U. Washington]


b. Kinds of Malicious Code (2)

Trojan horse - A computer program that appears to have a useful function, but also has a hidden and potentially malicious function that evades security mechanisms, sometimes by exploiting legitimate authorizations of a system entity that invokes the program

Virus - A hidden, self-replicating section of computer software, usually malicious logic, that propagates by infecting (i.e., inserting a copy of itself into and becoming part of) another program. A virus cannot run by itself; it requires that its host program be run to make the virus active.

Worm - A computer program that can run independently, can propagate a complete working version of itself onto other hosts on a network, and may consume computer resources destructively.


Kinds of Malicious Code (3)

Bacterium - A specialized form of virus which does not attach to a specific file. Usage obscure.

Logic bomb - Malicious [program] logic that activates when specified conditions are met. Usually intended to cause denial of service or otherwise damage system resources.

Time bomb - a logic bomb that activates when a specified time occurs

Rabbit – a virus or worm that replicates itself without limit, to exhaust resources

Trapdoor / backdoor - A hidden computer flaw known to an intruder, or a hidden computer mechanism (usually software) installed by an intruder, who can activate the trap door to gain access to the computer without being blocked by security services or mechanisms.


Kinds of Malicious Code (4)

The above terms are not always used consistently, esp. in the popular press
Combinations of the above kinds are even more confusing
  E.g., a virus can be a time bomb – spreads like a virus, „explodes” when the specified time occurs
The term „virus” is often used to refer to any kind of malicious code
  When discussing malicious code, we'll often say „virus” for any malicious code


c. How Viruses Work (1)

A pgm containing a virus must be executed to spread the virus or infect other pgms
  Even one pgm execution suffices to spread the virus widely

Virus actions: spread / infect

Spreading – Example 1: virus in a pgm on an installation CD
  User activates the pgm containing the virus when she runs INSTALL or SETUP
  Virus installs itself in any/all executing pgms present in memory
  Virus installs itself in pgms on the hard disk
  From now on the virus spreads whenever any of the infected pgms (from memory or hard disk) executes


How Viruses Work (2)

Spreading – Example 2: virus in an attachment to an e-mail msg
  User activates the pgm containing the virus (e.g., a macro in MS Word) by just opening the attachment
    => Disable automatic opening of attachments!!!
  Virus installs itself and spreads ... as in Example 1 ...

Spreading – Example 3: virus in a downloaded file
  File with a pgm or a document (.doc, .xls, .ppt, etc.)
  You know the rest by now...

Document virus
  Spreads via pictures, documents, spreadsheets, slide presentations, databases, ...
  E.g., via .jpg, or via MS Office documents: .doc, .xls, .ppt, .mdb
  Currently the most common!


How Viruses Work (3)

Kinds of viruses w.r.t. the way of attaching to infected pgms:

1) Appended viruses
  Appends itself to the pgm
  Most often the virus code precedes the pgm code
    Inserts its code before the 1st pgm instruction in the executable pgm file
  Executes whenever the program is executed

2) Surrounding viruses
  Surrounds the program
  Executes before and after the infected program
    Intercepts its input/output
    Erases its tracks
  The „after” part might be used to mask the virus's existence
    E.g., if it surrounds „ls”, the „after” part removes the listing of the virus file produced by „ls”, so the user can't see it

... cont. ...


How Viruses Work (4) ... cont. ...

3) Integrating viruses
  Integrate into the pgm code
  Spread within infected pgms

4) Replacing viruses
  Entirely replace the code of the infected pgm file


How Viruses Work (5)

A (replacing) virus V gains control over target pgm T by:
  Overwriting T on the hard disk, OR
  Changing the pointer to T into a pointer to V (textbook, Fig. 3-7)
    The OS has a File Directory
    The File Directory has an entry that points to the file with the code for T
    The virus replaces the pointer to T's file with a pointer to V's file

In both cases, actions of V replace actions of T when the user executes what she thinks is „T”


How Viruses Work (6)

Characteristics of a 'perfect' virus (goals of virus writers):
  Hard to detect
  Not easily destroyed or deactivated
  Spreads infection widely
  Can reinfect programs
  Easy to create
  Machine- and OS-independent


How Viruses Work (7)

Virus hiding places:

1) In the bootstrap sector – the best place for a virus
  Bec. the virus gains control early in the boot process
    Before detection tools are active!

2) In memory-resident pgms
  TSR pgms (TSR = terminate and stay resident)
  Most frequently used OS pgms, or specialized user pgms
    => a good place for viruses (activated very often)

...cont...

[Fig. cf. J. Leiwo & textbook: boot sector layout before and after infection]


How Viruses Work (8) ...cont...

3) In application pgms
  Best for viruses: apps with macros (MS Word, MS PowerPoint, MS Excel, MS Access, ...)
    One macro: the startup macro, executed when the app starts
    Virus instructions attach to the startup macro and infect document files
      Bec. doc files can include app macros (commands)
      E.g., .doc files include macros for MS Word
    Via data files the virus infects other startup macros, etc., etc.

4) In libraries
  Libraries are used/shared by many pgms => they spread the virus
  Execution of an infected library pgm infects

5) In other widely shared pgms
  Compilers / loaders / linkers
  Runtime monitors
  Runtime debuggers
  Virus control pgms (!)


d. Virus Signatures (1)

A virus hides but can't become invisible – it leaves behind a virus signature, defined by patterns:
1) Storage patterns: it must be stored somewhere/somehow (maybe in pieces)
2) Execution patterns: it executes in a particular way
3) Distribution patterns: it spreads in a certain way

Virus scanners use virus signatures to detect viruses (in the boot sector, on the hard disk, in memory)
  A scanner can use file checksums to detect changes to files
  Once the scanner finds a virus, it tries to remove it
    i.e., tries to remove all pieces of a virus V from the target pgm T

The virus scanner and its database of virus signatures must be up to date to be effective!
  Update and run daily!
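A toy C sketch of scanning for a storage pattern – searching a file for a fixed byte signature. The signature bytes are made up; real scanners use large signature databases, wildcard patterns, and heuristics:

  #include <stdio.h>

  /* Hypothetical 4-byte storage signature of some virus V.
     (All four bytes are distinct, so the simple reset below
     cannot miss an overlapping match.) */
  static const unsigned char SIG[] = { 0xDE, 0xAD, 0xBE, 0xEF };

  /* Return 1 if the signature occurs in the file, 0 if not, -1 on error. */
  int scan_file(const char *path) {
      FILE *f = fopen(path, "rb");
      if (!f) return -1;

      unsigned char buf[4096];
      size_t n, matched = 0;              /* signature bytes matched so far */
      while ((n = fread(buf, 1, sizeof buf, f)) > 0) {
          for (size_t i = 0; i < n; i++) {
              if (buf[i] == SIG[matched]) matched++;
              else matched = (buf[i] == SIG[0]) ? 1 : 0;
              if (matched == sizeof SIG) { fclose(f); return 1; }
          }
      }
      fclose(f);
      return 0;     /* 'matched' persists across reads, so a signature
                       split across two buffers is still found */
  }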


Virus Signatures (2)
Detecting Virus Signatures (1)

Difficulty 1 – in detecting execution patterns:
  Most effects of virus execution (see the table on the next slide) are „invisible”
    Bec. they are normal – any legitimate pgm could cause them (hiding in a crowd)
    => they can't help in detection


Virus Signatures (3)
Detecting Virus Signatures (2)

Virus Goal                  | How Achieved
----------------------------+----------------------------------------------------
Attach to executable        | Modify file directory / Write to executable pgm file
Attach to data/control file | Modify directory / Rewrite data /
                            |   Append to data / Append data to self
Remain in memory            | Intercept interrupt by modifying interrupt handler
                            |   address table / Load self in non-transient
                            |   memory area
Infect disks                | Intercept interrupt / Intercept OS call (e.g., to
                            |   format disk) / Modify system file / Modify
                            |   ordinary executable pgm
Conceal self                | Intercept system calls that would reveal self and
                            |   falsify results / Classify self as a „hidden” file
Spread self                 | Infect boot sector / Infect system pgm / Infect
                            |   ordinary pgm / Infect data that an ordinary pgm
                            |   reads to control its execution
Prevent deactivation        | Activate before the deactivating pgm and block
                            |   deactivation / Store copy to reinfect after
                            |   deactivation

[cf. textbook & B. Endicott-Popovsky]


Virus Signatures (4)
Detecting Virus Signatures (3)

Difficulty 2 – in finding storage patterns: polymorphic viruses
  A polymorphic virus changes from one „form” (storage pattern) to another
  A simple virus is always recognizable by a certain character pattern
  A polymorphic virus mutates into a variety of storage patterns

Examples of polymorphic virus mutations:
  Randomly repositions all parts of itself and randomly changes all fixed data within its code
    Repositioning is easy, since (infected) files are stored as chains of data blocks, chained with pointers
  Randomly intersperses harmless instructions throughout its code (e.g., add 0, jump to next instruction)
  Encrypting virus: encrypts its object code (each time with a different/random key), decrypts the code to run ... more below ...


Virus Signatures (5)
Detecting Virus Signatures (4)

Encrypting virus structure (informal pseudo-code):

  array decr_key;
  procedure decrypt(virus_code, decr_key)
    ...
  end /* decrypt */

  begin /* virus V in target pgm T */
    decrypt(V, decr_key);   /* V's body is stored encrypted;
                               only decrypt and decr_key are in the clear */
  infect:
    if infect_condition met then
      find new target pgms NT to infect;
      mutate V into V' for copying;
      encrypt V' with a random key into V'';
      save the new key in a file for V'';
      attach V'' to NT;
      hide the modification of NT (with stealth code of V);
  damage:
    if damage_condition met then
      execute damage_code of V
    else
      start T
  end /* virus V in target pgm T */


Virus Signatures (6)
Detecting Virus Signatures (5)

Encrypting virus: encrypts its object code (each time with a different/random key), decrypts the code to run

Q: Is there any signature for an encrypting virus that a scanner can see?
  Hint: consider the 3 parts of an encrypting virus:
    „proper” virus code (infect/damage code)
    decr_key
    procedure decrypt


Virus Signatures (7)
Detecting Virus Signatures (6)

... Q: Is there any signature for an encrypting virus that a scanner can see?
A: Let's see:
  „proper” virus code – encrypted with a random key – polymorphic
  decr_key – random key used to encrypt/decrypt – polymorphic
  procedure decrypt (or a pointer to a library decrypt procedure) – unencrypted, static
    => procedure decrypt of V is its signature, visible to a scanner

But: a virus writer can use polymorphic techniques on the decryption code to make it „less visible” (to hide it)
  Virus writers and scanner writers challenge each other
  An endless game?
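A toy C illustration of why the decryptor is the static part, assuming a simple one-byte XOR scheme: the encrypted body differs with every random key, but this stub compiles to the same bytes in every copy of the virus – and that is what a scanner can match:

  #include <stddef.h>

  /* The encrypted body changes with every key; this loop never does --
     its compiled bytes are the scannable signature. */
  void decrypt(unsigned char *body, size_t len, unsigned char key) {
      for (size_t i = 0; i < len; i++)
          body[i] ^= key;       /* same code for any key */
  }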


e. Preventing Virus Infections

Use commercial software from trustworthy sources
  But even this is not an absolute guarantee of virus-free code!
Test new software on isolated computers
Open only safe attachments
Keep a recoverable system image in a safe place
Back up executable system files
Use virus scanners often (daily)
Update virus detectors daily
  Databases of virus signatures change very often

No absolute guarantees even if you follow all the rules – just much better chances of preventing a virus

[cf. B. Endicott-Popovsky]


f. Seven Truths About Viruses

1. Viruses can infect any platform
2. Viruses can modify „hidden” / „read-only” files
3. Viruses can appear anywhere in a system
4. Viruses spread anywhere sharing occurs
5. Viruses cannot remain in memory after a complete power-off / power-on reboot
   But a virus reappears if saved on disk (e.g., in the boot sector)
6. Viruses infect software that runs hardware
   There are firmware viruses (if the firmware is writable by s/w)
7. Viruses can be malevolent, benign, or benevolent

Hmmm... Would you like a benevolent virus doing good things (like compressing pgms to save storage) but without your knowledge?

[cf. B. Endicott-Popovsky]


g. Case Studies (1)

The Internet Worm
  Attacked on 11/2/1988
  Invaded VAX and Sun-3 computers running versions of Berkeley UNIX
  Used their resources to attack still more computers
  Within hours spread across the U.S.
  Infected hundreds / thousands of computers – serious damage to the Internet
    Some uninfected networks were scared into disconnecting from the Internet
      => severed connections stopped necessary work
  Made many computers unusable via resource exhaustion
    Was a rabbit – supposedly by mistake, unintended by its writer
  Perpetrator was convicted in 1990 ($10,000 fine + 400 hrs of community service + 3-year suspended jail sentence)
  Caused the formation of the Computer Emergency Response Team (CERT) at CMU

[cf. textbook & B. Endicott-Popovsky]


Case Studies (2)

Other case studies [textbook – interesting reading]:
  The Brain (Pakistani) Virus (1986)
  Code Red (2001)
    Denial-of-service (DoS) attack on www.whitehouse.gov
  Web Bugs (generic potentially malicious code on web pages)
    Places a cookie on your hard drive
    The cookie collects statistics on the user's surfing habits
    Can be used to get your IP address, which can then be used to target you for attack
    Block cookies, or delete cookies periodically (e.g., using a browser command; in MS IE: Tools > Internet Options > General: Delete Cookies)
    Tool: Bugnosis from the Privacy Foundation – locates web bugs


h. Virus Removal and System Recovery After Infection

Fixing a system after infection by virus V:

1) Disinfect (remove) viruses (using an antivirus pgm)
  Can often remove V from the infected file for T w/o damaging T
    if V's code can be separated from T's code and V did not corrupt T
  Have to delete T if V's code can't be separated from T's code

2) Recover files:
  - deleted by V
  - modified by V
  - deleted during disinfection (by the antivirus pgm)
  => need file backups!
    Make sure to have backups of (at least) important files


3.3.2. Targeted Malicious Code

Targeted = written to attack a particular system, a particular application, and for a particular purpose

Many virus techniques apply
Some new techniques as well

Outline:
a. Trapdoors
b. Salami attack
c. Covert channels


a. Trapdoors (1)

Original def:
  Trapdoor / backdoor – a hidden computer flaw known to an intruder, or a hidden computer mechanism (usually software) installed by an intruder, who can activate the trapdoor to gain access to the computer without being blocked by security services or mechanisms.

A broader definition:
  Trapdoor – an undocumented entry point to a module
  Inserted during code development
    For testing
    As a hook for future extensions
    As emergency access in case of s/w failure


Trapdoors (2)

Testing:
  With stubs and drivers for unit testing (Fig. 3-10, p. 138)
  Testing with debugging code inserted into tested modules
    May allow the programmer to modify internal module variables

Major sources of trapdoors:
  Left-over (purposely or not) stubs, drivers, debugging code
  Poor error checking
    E.g., allowing unacceptable input that causes buffer overflow
  Undefined opcodes in h/w processors
    Some were used for testing, some are random

Not all trapdoors are bad
  Some are left purposely, with good intentions – to facilitate system maintenance/audit/testing


b. Salami Attack

Salami attack – merges bits of seemingly inconsequential data to yield powerful results

Old example: interest calculation in a bank
  Fractions of 1¢ „shaved off” n accounts and deposited in the attacker's account
  Nobody notices/cares if 0.1¢ vanishes
  It can accumulate to a large sum

Easy target for salami attacks: computations combining large numbers with small numbers
  They require rounding and truncation of numbers
  Relatively small error from these operations is accepted as unavoidable – not checked unless there is a strong suspicion
  The attacker can hide „salami slices” within the error margin
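A toy C sketch of the arithmetic, with made-up figures: interest is computed in tenths of a cent, each account is credited whole cents only, and the truncated remainders are diverted instead of being rounded or returned:

  #include <stdio.h>

  int main(void) {
      long n_accounts = 10000;        /* hypothetical */
      long interest_tenths = 1237;    /* 123.7 cents, in tenths of a cent */

      long shaved_tenths = 0;
      for (long i = 0; i < n_accounts; i++) {
          long credited = interest_tenths / 10 * 10;    /* truncate to whole cents */
          shaved_tenths += interest_tenths - credited;  /* 0.7 cent per account */
      }
      /* 0.7 cent x 10,000 accounts = $70.00, invisible per account */
      printf("diverted: $%.2f\n", shaved_tenths / 1000.0);
      return 0;
  }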


c. Covert Channels (CC) (1)

Outline:
i. Covert Channels – Definition and Examples
ii. Types of Covert Channels
iii. Storage Covert Channels
iv. Timing Covert Channels
v. Identifying Potential Covert Channels
vi. Covert Channels – Conclusions


i. CC – Definition and Examples (1)

So far we looked at malicious pgms that perform wrong actions
Now: pgms that disclose confidential/secret info
  They violate confidentiality, secrecy, or privacy of info

Covert channels = channels of unwelcome disclosure of info
  Extract/leak data clandestinely

Examples:
1) An old military radio communication network
  The busiest node is most probably the command center
  (Nobody is so naive nowadays)
2) Secret ways spies recognize each other
  Holding a certain magazine in hand
  Exchanging a secret gesture when approaching each other
  ...


Covert Channels – Definition and Examples (2)

How do programmers create covert channels?
  By providing the pgm with a built-in Trojan horse
    The Trojan uses a covert channel to communicate extracted data

Example: pgm w/ Trojan horse using a covert channel

Should be:

  Protected Data <------[ Service Pgm ]------> Legitimate User

Is:

  Protected Data <------[ Service Pgm  ]------> Legitimate User
                        [ w/ Trojan h. ]
                               |
                               |  covert channel
                               v
                              Spy

(Spy – e.g., the programmer who put the Trojan into the pgm; receives data directly or via a Spy Pgm)


Covert Channels – Definition and Examples (3)

How are covert channels created? I.e., how are leaked data hidden?

Example: leaked data hidden in output reports (or displays)
  Different 'marks' in the report (cf. Fig. 3-12, p. 143):
    Varying report format
    Changing line length / changing the nr of lines per page
    Printing or not printing certain values, characters, or headings
  Each 'mark' can convey one bit of info


Covert Channels – Definition and Examples (4)

Example – ctd. How can a Trojan within a pgm leak a 4-bit value of a protected variable X? (cf. Fig. 3-12, p. 143)

The Trojan signals the value of X as follows:
  Bit 1 = 1 if more than one space follows 'ACCOUNT CODE:'; 0 otherwise
  Bit 2 = 1 if the last digit in the 'seconds' field is > 5; 0 otherwise
  Bit 3 = 1 if the heading uses 'TOTALS'; 0 otherwise (it uses 'TOTAL')
  Bit 4 = 1 if no space follows the subtotals line; 0 otherwise

=> For the values as in this Fig., the Trojan signaled and the spy got: X = '1101'


ii. Types of Covert Channels

Types of covert channels:
  Storage covert channels
    Convey info by the presence or absence of an object in storage
  Timing covert channels
    Convey info by varying the speed at which things happen


iii. Storage Channels (1)

Example of a storage channel: the file lock covert channel
  Protected variable X has n bits: X1, ..., Xn
  A Trojan within the Service Pgm leaks the value of X
  The Trojan and the Spy Pgm are synchronized, so they can „slice” time into n intervals
  File FX (not used by anybody else)
  To signal that Xk = 1, the Trojan locks file FX for interval k (1 ≤ k ≤ n)
  To signal that Xk = 0, the Trojan unlocks file FX for interval k
  The Spy Pgm tries to lock FX during each interval
    If it succeeds during the k-th interval, Xk = 0 (FX was unlocked)
    Otherwise, Xk = 1 (FX was locked)

(see Fig. 3-13, 3-14 – p. 144-145)

Q: Why should FX not be used by anybody else?


Storage Channels (2)

Example of a storage channel: the file lock covert channel ...

Q: Why should FX not be used by anybody else?
A: Any other user locking/unlocking FX would interfere with the Trojan's covert-channel signaling.

Isn't such bit-by-bit signaling too slow?
  No – bec. computers are very fast!
  E.g., 10-100 bits/millisecond (10K – 100K b/s) is very slow for computers
  It can still leak the entire P&P textbook in just minutes
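A minimal C sketch of the Trojan's side of this channel on a BSD/Linux system (flock() and usleep() are assumed available; synchronizing the interval clock with the Spy Pgm is set up out of band):

  #include <fcntl.h>
  #include <sys/file.h>
  #include <unistd.h>

  /* For each bit X[k], hold an exclusive lock on FX during interval k
     iff X[k] == 1. The spy attempts a non-blocking lock each interval:
     failure to lock means "1", success means "0". */
  void send_bits(const char *fx_path, const int *x, int n) {
      int fd = open(fx_path, O_RDWR | O_CREAT, 0666);
      if (fd < 0) return;
      for (int k = 0; k < n; k++) {
          if (x[k]) flock(fd, LOCK_EX);   /* locked   => signals 1 */
          else      flock(fd, LOCK_UN);   /* unlocked => signals 0 */
          usleep(100 * 1000);             /* one 100 ms signaling interval */
      }
      flock(fd, LOCK_UN);
      close(fd);
  }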


Storage Channels (3)

Examples of covert storage channels (synchronized intervals!)

Covert channels can use:
  File locks (discussed above)
  Disk storage quota
    To signal Xk = 1, the Trojan creates an enormous file (consuming most of the available disk space)
    The Spy Pgm attempts to create an enormous file. If the Spy fails (bec. no disk space is available), Xk = 1; otherwise, Xk = 0
  Existence of a file
    To signal Xk = 1, the Trojan creates file FX (even an empty file)
    The Spy Pgm attempts to create a file named FX. If the Spy fails (bec. FX already exists), Xk = 1; otherwise, Xk = 0
  Other resources – similarly


Storage Channels (4)

Covert storage channels require:
  A shared resource
    To indicate Xk = 1 or Xk = 0
  Synchronized time
    To know which bit is signaled: in interval k, Xk is signaled


iv. Timing Channels

Recall: timing channels convey info by varying the speed at which things happen

Simple example of a timing channel:
  A multiprogramming system „slices” processor time for the programs running on the processor
  2 processes only: Trojan (pgm w/ Trojan) and Spy Pgm
    Trojan receives all odd slices (unless it abstains)
    Spy Pgm receives all even slices (unless it abstains)
  Trojan signals Xk = 1 by using its time slice, and signals Xk = 0 by abstaining from using its slice
    (see Fig. 3-15, p. 147 – how '101' is signaled)
  Details:
    Trojan takes Slice 1 (its 1st slice), signaling X1 = 1
    Trojan abstains from taking Slice 3 (its 2nd slice), signaling X2 = 0
    Trojan takes Slice 5 (its 3rd slice), signaling X3 = 1


v. Identifying Potential Covert Channels (1)

Covert channels are not easy to identify
  Otherwise they wouldn't be covert, right?

Two techniques for locating covert channels:
1) Shared Resource Matrix
2) Information Flow Method


Identifying Potential Covert Channels (2)

1) The Shared Resource Matrix method
  A shared resource is the basis for a covert channel
  => identify shared resources, and the processes reading/writing them

Step 1: Construct the Shared Resource Matrix
  Rows – resources
  Columns – processes that access them:
    R = observe resource
    M = modify/set/create/delete resource

Example:

              | Process 1 | Process 2
  Lock on FX  |   R, M    |   R, M
  X (confid.) |   R       |


Identifying Potential Covert Channels (3) ...

Step 2: Look for this pattern:

         | Pi | Pj
  Rm     | M  | R
  Rn     | R  |

Meaning of this pattern: process Pj can get the value of resource Rn via process Pi (and a covert channel)

Q: Do you see such a pattern in the SRM above?

              | Pgm 1 | Pgm 2
  Lock on FX  | R, M  | R, M
  X (confid.) | R     |


Identifying Potential Covert Channels (4) ...

Step 2: Look for this pattern:

         | Pi | Pj
  Rm     | M  | R
  Rn     | R  |

Meaning of this pattern: process Pj can get the value of resource Rn via process Pi (and a covert channel)

Q: Do you see such a pattern in the SRM above?
A: Yes. Process 2 can get the value of X via Process 1
  (no surprise: Processes 1 & 2 are the Trojan & Spy from the earlier example)

              | Process 1 | Process 2
  Lock on FX  |   R, M    |   R, M
  X (confid.) |   R       |


Identifying Potential Covert Channels (5)

2) Information Flow Method
 Flow analysis of pgm's syntax
 Can be automated within a compiler
 Identifies non-obvious flows of info between pgm statements

Examples of flows of info between pgm stmts:
 B := A – an explicit flow from A to B
 B := A; C := B – an explicit flow from A to C (via B)
 IF C = 1 THEN B := A
  – an explicit flow from A to B
  – an implicit flow from C to B (bec. B can change iff C = 1)
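
A C rendering of these flows (an illustration; the slides' statements are language-neutral pseudocode, and d plays the slides' C in the second example, since c here is the condition variable):

    /* Sketch: the flows above in C. A flow analyzer would flag both
       assignments to b as carrying information from a, and the
       guarded one as also carrying information from c. */
    #include <stdio.h>

    int main(void) {
        int a = 42, b = 0, c = 1;

        b = a;          /* explicit flow: a -> b         */
        int d = b;      /* explicit flow: a -> d (via b) */

        b = 0;
        if (c == 1)     /* implicit flow: c -> b ...     */
            b = a;      /* ... and explicit flow: a -> b */

        printf("%d %d\n", b, d);
        return 0;
    }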


Identifying Potential Covert Channels (6)

More examples of flows of info between pgm stmts

[textbook and J. Leiwo]


Identifying Potential Covert Channels (7)

Steps of the Information Flow Method (IFM):
1) Analyze statements
2) Integrate results to see which outputs are affected by which inputs

Variants of IFM:
1) IFM during compilation
2) IFM on design specs


Covert Channels – Conclusions
 Covert channels are a serious threat to confidentiality, and thus to security („CIA” = security)
 Any virus/Trojan horse can create a covert channel
 In open systems — no way to prevent covert channels
 Very high security systems require a painstaking and costly design preventing (some) covert channels
 Analysis must be performed periodically as a high-security system evolves


3.4. Controls for Security
 How to control security of pgms during their development and maintenance

Outline:
a. Introduction
b. Developmental controls for security
c. Operating system controls for security
d. Administrative controls for security
e. Conclusions


a. Introduction
 „Better to prevent than to cure”

Preventing security flaws
 We have seen a lot of possible security flaws
 How to prevent (some of) them?
 Software engineering concentrates on developing and maintaining quality s/w
 We'll take a look at some techniques useful specifically for developing/maintaining secure s/w

Three types of controls for security (against pgm flaws):
1) Developmental controls
2) OS controls
3) Administrative controls


b. Developmental Controls for Security (1)

Nature of s/w development
 Collaborative effort
 Team of developers, each involved in one of the stages:
  Requirement specification
   Regular req. specs: „do X”
   Security req. specs: „do X and nothing more”
  Design
  Implementation
  Testing
  Documenting at each stage
  Reviewing at each stage
  Managing system development thru all stages
  Maintaining deployed system (updates, patches, new versions, etc.)

Both product and process contribute to overall quality — incl. the security dimension of quality


Developmental Controls for Security (2)
 Fundamental principles of s/w engineering:
 1) Modularity
 2) Encapsulation
 3) Info hiding

1) Modularity
 Modules should be:
  Single-purpose – logically/functionally
  Small – for a human to grasp
  Simple – for a human to grasp
  Independent – high cohesion, low coupling
   High cohesion – highly focused on (single) purpose
   Low coupling – free from interference from other modules
 Modularity should improve correctness
  Fewer flaws => better security


Developmental Controls for Security (3)

2) Encapsulation
 Minimizing info sharing with other modules
  => Limited interfaces reduce the # of covert channels
 Well-documented interfaces
 „Hiding what should be hidden and showing what should be visible.”

3) Information hiding
 Module is a black box
  Well-defined function and I/O
  Easy to know what a module does, but not how it does it
 Reduces complexity, interactions, covert channels, ... => better security
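
A minimal C sketch of encapsulation/info hiding via an opaque type; the Counter example and the file split are hypothetical:

    /* Sketch: information hiding in C. The "header" part exposes only
       what callers may see; the "implementation" part hides the
       representation. */

    /* ---- counter.h: the visible interface (the "what") ---- */
    typedef struct Counter Counter;          /* opaque: layout hidden */
    Counter *counter_new(void);
    void     counter_inc(Counter *c);
    int      counter_value(const Counter *c);

    /* ---- counter.c: the hidden implementation (the "how") ---- */
    #include <stdlib.h>

    struct Counter { int value; };           /* invisible to callers */

    Counter *counter_new(void) { return calloc(1, sizeof(Counter)); }

    void counter_inc(Counter *c) { if (c) c->value++; }

    int counter_value(const Counter *c) { return c ? c->value : 0; }

Because callers never see struct Counter, the interface stays small and well documented — which is exactly what limits interactions and potential covert channels.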


Developmental Controls for Security (4)

Techniques for building solid software:
1) Peer reviews
2) Hazard analysis
3) Testing
4) Good design
5) Risk prediction & management
6) Static analysis
7) Configuration management
8) Additional developmental controls

... Please read on your own ... Also see slides — all discussed below ...

[cf. B. Endicott-Popovsky]


Developmental Controls for Security (5)

1) Peer reviews – three types:
 Review
  Informal
  Team of reviewers
  Gain consensus on solutions before development
 Walk-through
  Developer walks team through code/document
  Discover flaws in a single design document
 Inspection
  Formalized and detailed
  Statistical measures used

Various types of peer reviews can be highly effective

[cf. B. Endicott-Popovsky]


Developmental Controls for Security (6)

2) Hazard analysis
 = systematic techniques to expose potentially hazardous system states, incl. security vulnerabilities

Components of HA:
 Hazard lists
 What-if scenarios – identify non-obvious hazards
 System-wide view (not just code)
 Begins Day 1
 Continues throughout SDLC (= s/w dev't life cycle)

Techniques:
 HAZOP – hazard and operability studies
 FMEA – failure modes and effects analysis
 FTA – fault tree analysis

[cf. B. Endicott-Popovsky]


Developmental Controls for Security (7)

3) Testing – phases:
 Module/component/unit testing of indiv. modules
 Integration testing of interacting (sub)system modules
 (System) function testing – checking against requirement specs
 (System) performance testing
 (System) acceptance testing – with customer, against customer's requirements — on seller's or customer's premises
 (System) installation testing – after installation on customer's system
 Regression testing – after updates/changes to s/w

Types of testing:
 Black box testing – testers can't examine code
 White box / clear box testing – testers can examine design and code, can see inside modules/system


Developmental Controls for Security (8)

4) Good design
 Good design uses:
 i. Modularity / encapsulation / info hiding
 ii. Fault tolerance
 iii. Consistent failure handling policies
 iv. Design rationale and history
 v. Design patterns

i. Using modularity / encapsulation / info hiding – as discussed above


Developmental Controls for Security (9)

4) Good design – cont.1a

ii. Using fault tolerance for reliability and security
 System tolerates component failures
 System more reliable than any of its components
  Different than for security, where a system is only as secure as its weakest component
 Fault-tolerant approach:
  Anticipate faults (car: anticipate having a flat tire)
   Active fault detection rather than passive fault detection (e.g., by use of mutual suspicion: active input data checking)
  Use redundancy (car: have a spare tire)
  Isolate damage
  Minimize disruption (car: replace flat tire, continue your trip)

[cf. B. Endicott-Popovsky]


Developmental Controls for Security (10)
4) Good design – cont.1b

Example 1: Majority voting (using h/w redundancy)
 3 processors running the same s/w
  E.g., in a spaceship
 Result accepted if the results of 2 processors agree

Example 2: Recovery block (using s/w redundancy)
 [Diagram: Primary Code (e.g., Quick Sort) -> Acceptance Test; on failure, fall back to Secondary Code (e.g., Bubble Sort)]
 Quick Sort – new code (faster)
 Bubble Sort – well-tested code
 (see the sketch below)
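
A C sketch of the recovery block, assuming the C library's qsort() as the fast "new" primary and a hand-written bubble sort as the well-tested secondary; the acceptance test simply checks that the output is sorted:

    /* Sketch: the recovery-block pattern. Run the primary, check the
       result with an acceptance test, and on failure restore the
       saved state and run the secondary. */
    #include <stdlib.h>
    #include <string.h>

    static int cmp_int(const void *a, const void *b) {
        return (*(const int *)a > *(const int *)b) -
               (*(const int *)a < *(const int *)b);
    }

    /* Acceptance test: output must be nondecreasing. */
    static int is_sorted(const int *a, size_t n) {
        for (size_t i = 1; i < n; i++)
            if (a[i - 1] > a[i]) return 0;
        return 1;
    }

    static void bubble_sort(int *a, size_t n) {       /* secondary */
        for (size_t i = 0; i + 1 < n; i++)
            for (size_t j = 0; j + 1 < n - i; j++)
                if (a[j] > a[j + 1]) {
                    int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                }
    }

    void sort_with_recovery(int *a, size_t n) {
        int *saved = malloc(n * sizeof *a);           /* checkpoint  */
        if (!saved) { bubble_sort(a, n); return; }
        memcpy(saved, a, n * sizeof *a);

        qsort(a, n, sizeof *a, cmp_int);              /* primary     */
        if (!is_sorted(a, n)) {                       /* acceptance  */
            memcpy(a, saved, n * sizeof *a);          /* restore     */
            bubble_sort(a, n);                        /* secondary   */
        }
        free(saved);
    }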


Developmental Controls for Security (11)

4) Good design – cont.2

iii. Using consistent failure handling policies
 Each failure handled in one of 3 ways:
  Retrying
   Restore previous state, redo service using a different „path”
   E.g., use secondary code instead of primary code
  Correcting
   Restore previous state, correct sth, run service using the same code as before
  Reporting
   Restore previous state, report failure to error handler, don't rerun service
 (see the sketch below)

Example — how fault tolerance enhances security: if a security fault destroys important data (availability in CIA), use f-t to revert to a backup data set
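
A compact C sketch of one consistent policy dispatcher; all names (State, primary, secondary) are illustrative:

    /* Sketch: one consistent failure-handling policy. Every service
       call restores the previous state on failure, then retries on an
       alternate path, corrects and reruns, or reports. */
    #include <stdio.h>

    typedef enum { POLICY_RETRY, POLICY_CORRECT, POLICY_REPORT } Policy;
    typedef struct { int data; } State;

    static int primary(State *s)   { return s->data == 0 ? 0 : -1; }
    static int secondary(State *s) { s->data = 0; return 0; }

    int run_service(State *s, Policy p) {
        State checkpoint = *s;              /* save previous state     */
        if (primary(s) == 0) return 0;      /* normal case             */

        *s = checkpoint;                    /* always restore first    */
        switch (p) {
        case POLICY_RETRY:                  /* redo via different path */
            return secondary(s);
        case POLICY_CORRECT:                /* fix sth, same code      */
            s->data = 0;                    /* illustrative correction */
            return primary(s);
        case POLICY_REPORT:                 /* hand off, don't rerun   */
            fprintf(stderr, "service failed; state restored\n");
            return -1;
        }
        return -1;
    }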


Developmental Controls for Security (12)

4) Good design – cont.3

iv. Using design rationale and history
 Knowing it (incl. design rationale and history for security mechanisms) helps developers modify or maintain the system

v. Using design patterns
 Knowing them enables looking for patterns showing what works best in which situation


Developmental Controls for Security (13)

Value of good design:
 Easy maintenance
 Understandability
 Reuse
 Correctness
 Better testing
 => translates into (saving) BIG bucks!

[cf. B. Endicott-Popovsky]


Developmental Controls for Security (14)

5) Risk prediction & management
 Predict and manage risks involved in system development and deployment
 Make plans to handle unwelcome events should they occur
 Risk prediction/mgmt is esp. important for security
  Bec. unwelcome and rare events can have security consequences
 Risk prediction/mgmt helps to select proper security controls (e.g., proportional to risk)


Developmental Controls for Security (15)

6) Static analysis
 Before the system is up and running, examine its design and code to locate security flaws
 More than peer review
 Examines:
  Control flow structure (sequence in which instructions are executed, incl. iterations and loops)
  Data flow structure (trail of data)
  Data structures
 Automated tools available

[cf. B. Endicott-Popovsky]


Developmental Controls for Security (16)

7) Configuration management
 = process of controlling system modifications during development and maintenance
 Offers security benefits by scrutinizing new/changed code

Problems with system modifications:
 One change interfering with another change
  E.g., neutralizing it
 Proliferation of different versions and releases
  Older and newer
  For different platforms
  For different application environments (and/or customer categories)


Developmental Controls for Security (17)

Reasons for software modification:
 Corrective changes
  To maintain control of the system's day-to-day functions
 Adaptive changes
  To maintain control over the system's modifications
 Perfective changes
  To perfect existing acceptable system functions
 Preventive changes
  To prevent the system's performance from degrading to unacceptable levels


Developmental Controls for Security (18)

Activities involved in the configuration management process (performed by reps from developers, customers, users, etc.)

1) Baseline identification
 A certain release/version (R/v) is selected & frozen as the baseline
 Other R's/v's described as changes to the baseline

2) Configuration control and configuration management
 Coordinate separate but related v's (versions) via:
  Separate files – separate files for each R or v
  Deltas – main v defined by „full files”; other v's defined by the main v & deltas (= difference files)
  Conditional compilation – single source code file F for all v's
   uses begin_version_Vx / end_version_Vx brackets or begin_not_version_Vx / end_not_version_Vx brackets
   compiler produces each v from F (see the sketch below)
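
The slides' begin_version_Vx / end_version_Vx brackets are generic; in C the same idea is typically expressed with preprocessor conditionals. A minimal sketch (macro and message text are assumptions):

    /* Sketch: conditional compilation as configuration control,
       using the C preprocessor as the analogue of the
       begin_version_Vx / end_version_Vx brackets above. */
    #include <stdio.h>

    void greet(void) {
    #ifdef VERSION_V2                /* begin_version_V2        */
        printf("hello from version V2\n");
    #else                            /* begin_not_version_V2    */
        printf("hello from the baseline version\n");
    #endif                           /* end of version brackets */
    }

    int main(void) { greet(); return 0; }

Compiling the single source file with cc -DVERSION_V2 f.c yields version V2; compiling without the flag yields the baseline version.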


Developmental Controls for Security (19)

3) Configuration auditing
 System must be audited regularly — to verify:
  Baseline completeness and accuracy
  Recording of changes
  Accuracy of software documentation for systems in the field
 Performed by independent parties

4) Status accounting
 Records info about system components:
  Where they come from (purchased, reused, written from scratch)
  Version
  Change history
  Pending change requests


Developmental Controls for Security (20)

All 4 activities performed by the Configuration Control Board (CCB)
 Includes reps from developers, customers, users
 Reviews proposed changes, approves/rejects them

Security benefits of configuration mgmt:
 Limits unintentional flaws
 Limits malicious modifications by protecting the integrity of pgms and documentation
 Thanks to:
  careful reviewing/auditing, change mgmt
  preventing changes (e.g., trapdoors) to the system w/o acceptance by the CCB


Developmental Controls for Security (21)

8) Additional developmental controls
8a) Learning from mistakes
 Avoiding such mistakes in the future enhances security

8b) Proofs of program correctness
 Formal methods to verify pgm correctness
 Logic analyzer shows that:
  initial assertions about inputs...
  ... through implications of pgm statements...
  ... lead to the terminal condition (desired output)
 Problems with practical use of pgm correctness proofs
  Esp. for large pgms/systems
 Most successful for specific types of apps
  E.g., for communication protocols & security policies

Even with all these developmental controls (1-8) – still no security guarantees! [cf. B. Endicott-Popovsky]


c. Operating System Controls for Security (1)

Developmental controls not always used
OR: Even if used, not foolproof
=> Need other, complementary controls, incl. OS controls

Such OS controls can protect against some pgm flaws


Operating System Controls for Security (2)

Trusted software – code rigorously developed and analyzed so we can trust that it does all and only what the specs say
 Trusted code establishes the foundation upon which untrusted code runs
 Trusted code establishes the security baseline for the whole system
 In particular, the OS can be trusted s/w


Operating System Controls for Security (3)

Key characteristics determining if OS code is trusted:
1) Functional correctness
 OS code consistent with specs
2) Enforcement of integrity
 OS keeps the integrity of its data and other resources even if presented with flawed or unauthorized commands
3) Limited privileges
 OS minimizes access to secure data/resources
 Trusted pgms must have „need to access” and proper access rights to use resources protected by the OS
 Untrusted pgms can't access resources protected by the OS
4) Appropriate confidence level
 OS code examined and rated at the appropriate trust level


Operating System Controls for Security (4)

Similar criteria used to establish if s/w other than the OS can be trusted

Ways of increasing security if untrusted pgms are present:
1) Mutual suspicion
2) Confinement
3) Access log

1) Mutual suspicion between programs
 Distrust other pgms – treat them as if they were incorrect or malicious
 Pgm protects its interface data
  With data checks, etc.
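
A minimal C sketch of mutual suspicion at a module interface; limits and names are illustrative:

    /* Sketch: mutual suspicion. The routine treats its caller as
       possibly incorrect or malicious and validates every property
       of the input before using it. */
    #include <stddef.h>
    #include <string.h>

    #define MAX_NAME 64

    /* Returns 0 on success, -1 if the interface data is rejected. */
    int set_username(char *dst, size_t dstlen, const char *src) {
        if (dst == NULL || src == NULL) return -1;   /* refuse NULLs */

        size_t n = strlen(src);
        if (n == 0 || n > MAX_NAME) return -1;       /* refuse bad lengths */

        for (size_t i = 0; i < n; i++) {             /* printable ASCII only */
            unsigned char ch = (unsigned char)src[i];
            if (ch < 0x20 || ch > 0x7e) return -1;
        }

        if (n + 1 > dstlen) return -1;               /* would overflow dst */
        memcpy(dst, src, n + 1);
        return 0;
    }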


Operating System Controls for Security (5)

2) Confinement
 OS can confine access to resources by a suspected pgm
 Example 1: strict compartmentalization
  Pgm can affect data and other pgms only within its compartment
 Example 2: sandbox for untrusted pgms
 Can limit the spread of viruses


Operating System Controls for Security (6)

3) Audit log / access log
 Records who/when/how (e.g., for how long) accessed/used which objects
 Events logged: logins/logouts, file accesses, pgm executions, device uses, failures, repeated unsuccessful commands (e.g., many repeated failed login attempts can indicate an attack)
 Audit frequently for unusual events, suspicious patterns
 A forensic measure, not a protective measure
  Forensics – investigation to find who broke the law, policies, or rules
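
A sketch of an audit-log writer in C; the log path and record format are assumptions, the point being an append-only who/when/what record:

    /* Sketch: an append-only audit-log writer. Each record captures
       who, when, what, and the outcome, so later forensic review can
       spot suspicious patterns (e.g., repeated failed logins). */
    #include <stdio.h>
    #include <time.h>

    int audit_log(const char *user, const char *event, int success) {
        FILE *f = fopen("/var/log/app-audit.log", "a");  /* append only */
        if (f == NULL) return -1;

        time_t now = time(NULL);
        char stamp[32];
        strftime(stamp, sizeof stamp, "%Y-%m-%d %H:%M:%S", localtime(&now));

        fprintf(f, "%s user=%s event=%s result=%s\n",
                stamp, user, event, success ? "ok" : "FAIL");
        fclose(f);
        return 0;
    }

    /* e.g., audit_log("alice", "login", 0) after a failed login attempt */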

...Much more on OS controls soon...


d. Administrative Controls for Security (1)

They prohibit or demand certain human behavior via policies, procedures, etc.

They include:
1) Standards of program development
2) Security audits
3) Separation of duties


Administrative Controls for Security (2)

1) Standards and guidelines (S&G) for program development
 Capture experience and wisdom from previous projects
 Facilitate building higher-quality s/w (incl. more secure)
 They include:
  Design S&G – design tools, languages, methodologies
  S&G for documentation, language, and coding style
  Programming S&G – incl. reviews, audits
  Testing S&G
  Configuration mgmt S&G

2) Security audits
 Check compliance with S&G
 Scare a potentially dishonest programmer away from including illegitimate code (e.g., a trapdoor)


Administrative Controls for Security (3)

3) Separation of duties
 Break sensitive tasks into 2 pieces to be performed by different people (learned from banks)
 Example 1: modularity
  Different developers for cooperating modules
 Example 2: independent testers
  Rather than a developer testing her own code

...More (much) later...


e. Conclusions (for Controls for Security)

Developmental / OS / administrative controls help produce/maintain higher-quality (also more secure) s/w

Art and science – no „silver bullet” solutions
„A good developer who truly understands security will incorporate security into all phases of development.” [textbook, p. 172]

Summary:
 Control           Purpose                                         Benefit
 Developmental     Limit mistakes; make malicious code difficult   Produce better software
 Operating System  Limit access to system                          Promotes safe sharing of info
 Administrative    Limit actions of people                         Improve usability, reusability and maintainability

[cf. B. Endicott-Popovsky]


End of Section 3: Program Security