
Big-Oh Notation

Agenda

• What is Big-Oh Notation?
• Example
• Guidelines
• Theorems

Big-Oh Notation (O)

f(x) is O(g(x)) iff there exist positive constants c and k such that f(x) <= c·g(x) for all x > k

Pronounced as f(x) is Big-Oh of g(x)

This gives an upper bound on the growth of the function

Example

Let f(n) = 10n + 5 and g(n) = n. To show f(n) is O(g(n)),

we must find constants c and k such that f(n) <= c·g(n) for all n >= k, i.e. 10n + 5 <= cn for all n >= k.

We may choose c and k to be any values we like, as long as they are positive.

Contd.. They can be as big as we want, but they

cannot be functions of n. Try c = 15. Then we need to show 10n + 5 <= 15n. Solving for n we get 5 <= 5n, i.e. 1 <= n. So f(n) = 10n + 5 <= 15n = 15·g(n) for all n >= 1.

(c = 15, k = 1). Therefore we have shown that f(n) is O(g(n)).
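As a quick sanity check, the chosen constants can also be verified numerically. The C program below is a minimal illustrative sketch (the program and its test range are additions, not part of the original slides):

#include <stdio.h>

// Check numerically that f(n) = 10n + 5 <= 15*g(n), with g(n) = n, for 1 <= n <= 10^6.
int main(void) {
    long n;
    int ok = 1;
    for (n = 1; n <= 1000000; n++) {
        if (10 * n + 5 > 15 * n) {   // compare f(n) against c*g(n) with c = 15
            ok = 0;
            break;
        }
    }
    printf(ok ? "10n + 5 <= 15n holds for all tested n >= 1\n"
              : "bound violated\n");
    return 0;
}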

How do we calculate big-O?

1. Loops
2. Nested loops
3. Consecutive statements
4. If-then-else statements
5. Logarithmic complexity


Five guidelines for finding out the time complexity of a piece of code

Guideline 1: Loops


The running time of a loop is, at most, the running time of the statements inside the loop (including tests) multiplied by the number of iterations.

for (i = 1; i <= n; i++) {   // executed n times
    m = m + 2;               // constant time
}

Total time = a constant c * n = cn = O(N)

Guideline 2: Nested loops


Analyse inside out. Total running time is the product of the sizes of all the loops.

for (i = 1; i <= n; i++) {       // outer loop executed n times
    for (j = 1; j <= n; j++) {   // inner loop executed n times
        k = k + 1;               // constant time
    }
}

Total time = c * n * n = cn^2 = O(N^2)

Guideline 3: Consecutive statements


Add the time complexities of each statement.

x = x + 1;                       // constant time

for (i = 1; i <= n; i++) {       // executed n times
    m = m + 2;                   // constant time
}

for (i = 1; i <= n; i++) {       // outer loop executed n times
    for (j = 1; j <= n; j++) {   // inner loop executed n times
        k = k + 1;               // constant time
    }
}

Total time = c0 + c1n + c2n^2 = O(N^2)

Guideline 4: If-then-else statements


Worst-case running time: the test, plus either the then part or the else part (whichever is the larger).

if (depth() != otherStack.depth()) {              // test: constant
    return false;                                 // then part: constant
} else {                                          // else part: (constant + constant) * n
    for (int n = 0; n < depth(); n++) {           // executed n times
        if (!list[n].equals(otherStack.list[n]))  // another if: constant + constant (no else part)
            return false;
    }
}

Total time = c0 + c1 + (c2 + c3) * n = O(N)

Guideline 5: Logarithmic complexity


An algorithm is O(log N) if it takes a constant time to cut the problem size by a fraction (usually by ½)

Example algorithm (binary search): finding a word in a dictionary of n pages (a code sketch follows the steps below)

• Look at the centre point in the dictionary
• Is the word to the left or right of the centre?
• Repeat the process with the left or right part of the dictionary until the word is found
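The C sketch below illustrates the same idea on a sorted array of integers (the array contents and the function name are assumptions added for illustration, not from the original slides). Each iteration halves the remaining range, so the loop runs O(log N) times.

#include <stdio.h>

// Binary search over a sorted int array: returns the index of target,
// or -1 if it is not present. Each pass halves the remaining range,
// so the loop body executes O(log N) times.
int binary_search(const int a[], int n, int target) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;        // centre point of the current range
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;   // target lies to the right of the centre
        else hi = mid - 1;                   // target lies to the left of the centre
    }
    return -1;
}

int main(void) {
    int pages[] = {1, 3, 5, 7, 9, 11};            // stands in for the sorted dictionary
    printf("%d\n", binary_search(pages, 6, 7));   // prints 3
    return 0;
}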

Theorems

Let d(n), e(n), f(n) and g(n) be functions mapping non-negative integers to the reals. Then:

1. If d(n) is O(f(n)), then a·d(n) is O(f(n)), for any constant a > 0
2. If d(n) is O(f(n)) and e(n) is O(g(n)), then d(n) + e(n) is O(f(n) + g(n))
3. If d(n) is O(f(n)) and e(n) is O(g(n)), then d(n)·e(n) is O(f(n)·g(n))
4. If d(n) is O(f(n)) and f(n) is O(g(n)), then d(n) is O(g(n))

Contd..

5. If f(n) is a polynomial of degree d, i.e. f(n) = a0 + a1·n + … + ad·n^d, then f(n) is O(n^d)
6. n^x is O(a^n) for any fixed x > 0 and a > 1
7. log n^x is O(log n) for any fixed x > 0
8. log^x n is O(n^y) for any fixed constants x > 0 and y > 0

Example: 2n^3 + 4n^2 log n is O(n^3)

Proof:
log n is O(n) – Rule 8
4n^2 log n is O(4n^3) – Rule 3
2n^3 + 4n^2 log n is O(2n^3 + 4n^3) – Rule 2
2n^3 + 4n^3 = 6n^3 is O(n^3) – Rule 1
2n^3 + 4n^2 log n is O(n^3) – Rule 4

Hence proved.

Relatives of Big-Oh

big-Omega: f(n) is Ω(g(n)) if there is a constant c > 0

and an integer constant n0 ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n0

big-Theta: f(n) is Θ(g(n)) if there are constants c' > 0

and c'' > 0 and an integer constant n0 ≥ 1 such that c''·g(n) ≤ f(n) ≤ c'·g(n) for n ≥ n0

Contd… little-oh: f(n) is o(g(n)) if, for any constant c > 0,

there is an integer constant n0 ≥ 0 such that f(n) ≤ c·g(n) for n ≥ n0

little-omega: f(n) is ω(g(n)) if, for any constant c > 0,

there is an integer constant n0 ≥ 0 such that f(n) ≥ c·g(n) for n ≥ n0
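As an illustrative addition (not in the original slides), here is how such constants can be exhibited for a concrete function, f(n) = 3n^2 + n:

Upper bound: 3n^2 + n ≤ 3n^2 + n^2 = 4n^2 for n ≥ 1, so f(n) is O(n^2) with c = 4, k = 1
Lower bound: 3n^2 + n ≥ 3n^2 for n ≥ 1, so f(n) is Ω(n^2) with c = 3, n0 = 1
Together, these give f(n) is Θ(n^2)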

Intuition for Asymptotic Notation

Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)

big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)

big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)

little-oh: f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)

little-omega: f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)

Bubble Sort – O(n^2)

int main() {
    int a[6] = {7, 5, 3, 4, 2, 1};
    int i = 0, j = 0;
    int n = 6;
    for (j = n - 1; j > 0; j--) {
        for (i = 0; i < j; i++) {
            if (a[i] > a[i + 1]) {
                int temp = a[i];   // swap
                a[i] = a[i + 1];
                a[i + 1] = temp;
            }
        }
    }
    return 0;
}
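As a quick check of the stated bound: on the pass with outer index j the inner loop performs j comparisons, so the total number of comparisons is (n-1) + (n-2) + … + 1 = n(n-1)/2, which is O(n^2).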

Assignment

Find the complexity of the following algorithm in Big-Oh – Binary Search

for (i = 1; i < 10; i++) {
    for (j = 0; j < i; j++) {
        if (arr[j] > arr[i]) {
            temp = arr[j];
            arr[j] = arr[i];
            for (k = i; k > j; k--)
                arr[k] = arr[k - 1];
            arr[k + 1] = temp;
        }
    }
}

References

Fundamentals of Computer Algorithms – Ellis Horowitz, Sartaj Sahni, Sanguthevar Rajasekaran
Algorithm Design – Michael T. Goodrich, Roberto Tamassia
Analysis of Algorithms – Jeffrey J. McConnell

THANK YOU