http://dcst2.east.asu.edu/~razdan/cst230/ (Razdan with contribution from others)

Algorithm Analysis: What is the Big 'O Bout?

Anshuman Razdan, Div of Computing Studies
[email protected]
Algorithm
• An algorithm is a procedure or sequence of instructions for solving a problem. Any algorithm may be expressed in many different ways: in English, in a particular programming language, or (most commonly) in a mixture of English and programming called pseudocode.
Example
• for each item
      do something
      if something greater than nothing
          return true
      else
          return false
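One concrete Java reading of this pseudocode, assuming for illustration that an "item" is an array element and "nothing" means zero (both are assumptions, not part of the slide):

```java
// Concrete instance of the pseudocode pattern: return true as soon as
// any element is greater than zero ("nothing"), else return false.
public class AnyPositive {
    static boolean anyGreaterThanZero(int[] items) {
        for (int item : items) {            // for each item
            if (item > 0) {                 // something greater than nothing
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(anyGreaterThanZero(new int[]{-2, 0, 3}));  // true
        System.out.println(anyGreaterThanZero(new int[]{-2, -1}));    // false
    }
}
```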
Analyzing Algorithms

Questions: Is this algorithm fast enough? How much memory will I need to implement this solution? Is there a better solution?

Analysis is useful BEFORE any implementation is done, to avoid wasting time on an inappropriate solution.

Algorithm analysis involves predicting the resources (time, space) required.

Analyze algorithms independent of machines, implementation, etc.
Run-time Analysis
Number of operations (amount of time) required for the algorithm to terminate
The number of operations typically depends on the program’s input size. E.g., sorting 1000 numbers takes longer (performs more operations) than sorting 3 numbers.
running time = function(input size). Input size is typically expressed as "N" or "n".
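The idea that running time is a function of input size can be made concrete by instrumenting a loop with an operation counter (a sketch; what counts as one "operation" is a modeling choice):

```java
// Count "operations" for a loop that sums the first n integers:
// the count grows linearly with the input size n.
public class OpCount {
    static long countOps(int n) {
        long ops = 0;
        long sum = 0;
        for (int i = 0; i < n; i++) {  // loop body runs n times
            sum += i;                  // one counted operation per element
            ops++;
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(countOps(3));     // 3
        System.out.println(countOps(1000));  // 1000
    }
}
```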
Space Analysis
• Amount of storage (memory) required for the program to execute correctly.
• Again, the amount of storage will often depend on the input size.
• Space complexity = function(input size)
Example – time complexity

Insertion-Sort(A)
    for j := 2 to length[A]
        do key := A[j]
        /* insert A[j] into sorted A[1..j-1] */
        i := j - 1
        while i > 0 and A[i] > key
            do A[i+1] := A[i]
            i := i - 1
        A[i+1] := key

Count the number of operations for inputs A = [3 5 2 8 3] and A = [2 3 3 5 8]:
n(4 + 3(n-1)/2) = (3n^2 + 5n)/2, where n is the size of A, so the algorithm is O(n^2).
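A runnable Java version of the pseudocode above, instrumented with a comparison counter (the counter is an addition for illustration) so that the already-sorted input A = [2 3 3 5 8] can be seen to do less work than A = [3 5 2 8 3]:

```java
import java.util.Arrays;

public class InsertionSort {
    static long comparisons = 0;  // instrumentation, not part of the algorithm

    static void insertionSort(int[] a) {
        for (int j = 1; j < a.length; j++) {  // pseudocode's j := 2 .. length[A]
            int key = a[j];
            int i = j - 1;
            // insert a[j] into the sorted prefix a[0..j-1]
            while (i >= 0 && a[i] > key) {
                comparisons++;                // a[i] > key succeeded
                a[i + 1] = a[i];
                i--;
            }
            if (i >= 0) comparisons++;        // the comparison that stopped the loop
            a[i + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] a = {3, 5, 2, 8, 3};
        insertionSort(a);
        System.out.println(Arrays.toString(a));  // [2, 3, 3, 5, 8]
        System.out.println(comparisons);
    }
}
```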
Algorithm Complexity
• Worst case running time is an upper bound on running time for any input of size n
• Best case is often not very useful
• Average case: average over all possible inputs. Often same complexity as worst case.
Stair counting problem
• Technique 1 – Keep a tally: 3n operations
• Technique 2 – Let someone else keep count: n + 2(1 + 2 + … + n) operations
• Technique 3 – Ask how many steps there are: 4 operations
• Example building: 2689 steps
Num Ops (Stair Counting)
• T1 : 3n
• T2 : n^2 + 2n
• T3 : floor(log10 n) + 1
Stair Counting
• 2689 stairs
• T1 : 8067 operations
• T2 : 7,236,099 operations
• T3 : 4 operations
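These three counts follow directly from the formulas on the previous slide; a quick sanity check (not part of the original slides):

```java
public class StairCounting {
    public static void main(String[] args) {
        long n = 2689;
        long t1 = 3 * n;                                 // T1: keep a tally
        long t2 = n * n + 2 * n;                         // T2: n + 2(1 + 2 + ... + n)
        long t3 = (long) Math.floor(Math.log10(n)) + 1;  // T3: ask
        System.out.println(t1);  // 8067
        System.out.println(t2);  // 7236099
        System.out.println(t3);  // 4
    }
}
```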
Number of Ops in Three Techniques
Stairs    O(log n)    O(n)      O(n^2)
10        2           30        120
100       3           300       10,200
1000      4           3000      1,002,000
10000     5           30000     100,020,000
Factoid
• The order of the algorithm is generally more important than the speed of the processor.
Search Algorithm
• for (i = 0; i < data.length; i++)
      if (data[i] == target)
          return true;
  return false;

What is the best case? The worst case? The average case?
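A compilable version of this search, with the standard answers noted as comments (best case: target is the first element, O(1); worst case: target is last or absent, O(n); average case: about n/2 comparisons for a present target, still O(n)):

```java
public class LinearSearch {
    static boolean contains(int[] data, int target) {
        for (int i = 0; i < data.length; i++) {
            if (data[i] == target) {
                return true;   // best case: hit at i == 0 -> O(1)
            }
        }
        return false;          // worst case: scanned all n elements -> O(n)
    }

    public static void main(String[] args) {
        int[] data = {4, 8, 15, 16, 23, 42};
        System.out.println(contains(data, 4));   // true  (best case)
        System.out.println(contains(data, 99));  // false (worst case)
    }
}
```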
Worst Case
• Usually we treat only the worst-case performance. The worst case occurs fairly often; for example, looking for an entry in a database that is not present.
• The result of the worst-case analysis is often no different from the average-case analysis (same order of complexity).
Asymptotic Notation

• If we want to treat large problems (these are the critical ones), we are interested in the asymptotic behavior of the growth of the running-time function. Thus, when comparing the running times of two algorithms: constant factors can be ignored, and lower-order terms are unimportant when the higher-order terms are different.
• For instance, when we analyze selection sort, we find that it takes T(n) = n^2 + 3n - 4 array accesses. For large values of n, the 3n - 4 part is insignificant compared to the n^2 part.
• An algorithm that takes 100n^2 time will still be faster than an algorithm that takes n^3 time for any value of n larger than 100.
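The crossover claim is easy to verify numerically (a sketch; the functions compared are the slide's 100n^2 and n^3):

```java
public class Crossover {
    // true when n^3 exceeds 100*n^2, i.e. exactly when n > 100
    static boolean cubeSlower(long n) {
        return n * n * n > 100 * n * n;
    }

    public static void main(String[] args) {
        for (long n : new long[]{50, 100, 101, 200}) {
            System.out.println("n=" + n + ": n^3 > 100n^2 ? " + cubeSlower(n));
        }
    }
}
```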
Big-O Definition

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

Equivalently, |f(n)| ≤ c·|g(n)| (use this to prove big-O).

[Figure: f(n) lies below c·g(n) for all n ≥ n0]
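As an example of the definition, to show 3n^2 + 5n = O(n^2) one can take c = 4 and n0 = 5, since 5n ≤ n^2 once n ≥ 5. The inequality f(n) ≤ c·g(n) can be checked over a range (f, c, and n0 here are choices made for this illustration):

```java
public class BigOCheck {
    // Does f(n) <= c*g(n) hold for every n in [n0, limit],
    // with f(n) = 3n^2 + 5n, g(n) = n^2, and c = 4?
    static boolean boundHolds(long n0, long limit) {
        for (long n = n0; n <= limit; n++) {
            long f = 3 * n * n + 5 * n;
            long cg = 4 * n * n;
            if (f > cg) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(boundHolds(5, 100000));  // witness c = 4, n0 = 5
        System.out.println(boundHolds(1, 4));       // the bound fails below n0
    }
}
```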
Big O Naming Convention

• If a function is O(log(n)) we call it logarithmic.
• If a function is O(n) we call it linear.
• If a function is O(n^2) we call it quadratic.
• If a function is O(n^k) with k ≥ 1, we call it polynomial.
• If a function is O(a^n) with a > 1, we call it exponential.
Asymptotic Notation: Θ, O, Ω, o, ω

• Θ – tight bound
  – When a problem is solved in the theoretical minimum time
• O – upper bound
• Ω – lower bound
• o – non-tight upper bound
• ω – non-tight lower bound
Big Ω Definition

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

[Figure: f(n) lies above c·g(n) for all n ≥ n0]
Θ Definition

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

[Figure: f(n) is sandwiched between c1·g(n) and c2·g(n) for all n ≥ n0]
Theorem

• For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).

[Figures: the O(g(n)) upper bound, the Ω(g(n)) lower bound, and the Θ(g(n)) sandwich between c1·g(n) and c2·g(n)]
• If lim (n→∞) f(n)/g(n) = 0, then f(n) = O(g(n))
• If lim (n→∞) f(n)/g(n) = ∞, then f(n) = Ω(g(n))
• If lim (n→∞) f(n)/g(n) = c with 0 < c < ∞, then f(n) = Θ(g(n))
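These limit rules can be illustrated numerically. For f(n) = 3n^2 + 5n and g(n) = n^2 (functions chosen for illustration), the ratio tends to the finite nonzero constant 3, so f(n) = Θ(n^2):

```java
public class LimitRatio {
    // f(n)/g(n) for f(n) = 3n^2 + 5n and g(n) = n^2
    static double ratio(double n) {
        return (3 * n * n + 5 * n) / (n * n);
    }

    public static void main(String[] args) {
        for (double n : new double[]{10, 1000, 1e6}) {
            System.out.println("n=" + n + "  f/g=" + ratio(n));
        }
        // The ratio approaches 3, a finite nonzero constant, so f = Theta(g).
    }
}
```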
o and ω Definitions

• f(n) = o(g(n)) if g(n) grows a lot faster than f(n):
  lim (n→∞) f(n)/g(n) = 0

• f(n) = ω(g(n)) if f(n) grows a lot faster than g(n):
  lim (n→∞) f(n)/g(n) = ∞
Prove

• 6n^3 ≠ Θ(n^2)
• (1/2)n^2 - 3n = Θ(n^2)
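A worked proof of the second claim, using the Θ definition from the earlier slide (the constants c1 = 1/14, c2 = 1/2, n0 = 7 are one valid choice, not the only one):

```latex
We must find positive constants $c_1, c_2, n_0$ such that
\[ c_1 n^2 \le \tfrac{1}{2}n^2 - 3n \le c_2 n^2 \quad \text{for all } n \ge n_0 . \]
Dividing through by $n^2$ gives
\[ c_1 \le \tfrac{1}{2} - \tfrac{3}{n} \le c_2 . \]
The right inequality holds for all $n \ge 1$ with $c_2 = \tfrac{1}{2}$.
The left inequality holds for all $n \ge 7$ with $c_1 = \tfrac{1}{14}$,
since $\tfrac{1}{2} - \tfrac{3}{7} = \tfrac{1}{14}$.
Hence $\tfrac{1}{2}n^2 - 3n = \Theta(n^2)$ with $c_1 = \tfrac{1}{14}$,
$c_2 = \tfrac{1}{2}$, $n_0 = 7$.
```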
Insertion Sort
• Link to Insertion Sort