Transcript of 240-6a-Big Oh
-
8/3/2019 240-6a-Big Oh
Algorithm Analysis (Big O)
CS-341
Dick Steflik
-
Complexity
In examining algorithm efficiency we must understand the idea of complexity:
Space complexity
Time complexity
-
Space Complexity
When memory was expensive we focused on making programs as space efficient as possible, and developed schemes to make memory appear larger than it really was (virtual memory and memory paging schemes)
Space complexity is still important in the field of embedded computing (hand-held computer-based equipment like cell phones, palm devices, etc.)
-
Time Complexity
Is the algorithm fast enough for my needs?
How much longer will the algorithm take if I increase the amount of data it must process?
Given a set of algorithms that accomplish the same thing, which is the right one to choose?
-
Algorithm Efficiency
a measure of the amount of resources consumed in solving a problem of size n
time
space
Benchmarking: implement algorithm, run with some specific input and measure time taken
better for comparing performance of processors than for comparing performance of algorithms
Big Oh (asymptotic analysis)
associates n, the problem size, with t, the processing time required to solve the problem
-
Cases to examine
Best case
if the algorithm is executed, the fewest number of instructions are executed
Average case
executing the algorithm produces path lengths that will on average be the same
Worst case
executing the algorithm produces path lengths that are always a maximum
-
Worst case analysis
Of the three cases, the only useful case (from the standpoint of program design) is that of the worst case.
Worst case helps answer the software lifecycle question of:
If it's good enough today, will it be good enough tomorrow?
-
Frequency Count
examine a piece of code and predict the number of instructions to be executed
e.g.
Code
for (int i = 0; i < n; i++)   // init: 1; test: n+1; increment: n
{ cout << i << endl; }        // body: n  -> frequency count = 3n + 2
-
Order of magnitude
In the previous example: best_case = avg_case = worst_case
Example is based on fixed iteration n
By itself, Freq. Count is relatively meaningless
Order of magnitude -> estimate of performance vs. amount of data
To convert F.C. to order of magnitude:
discard constant terms
disregard coefficients
pick the most significant term
Worst case path through algorithm -> order of magnitude will be Big O (i.e. O(n))
-
Another example
Code
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
    { cout << i << "," << j << endl; }   // body executes n * n times
-
What is Big O?
Big O
rate at which algorithm performance degrades as a function of the amount of data it is asked to handle
For example:
O(n) -> performance degrades at a linear rate
O(n^2) -> quadratic degradation
-
Common growth rates
-
Big Oh - Formal Definition
Definition of "big oh":
f(n) = O(g(n)), iff there exist constants c and n0 such that:
f(n) <= c * g(n) for all n >= n0
Thus, g(n) is an upper bound on f(n)
Note:
f(n) = O(g(n)) is NOT the same as
O(g(n)) = f(n)
The '=' is not the usual mathematical operator "=" (it is not symmetric)