Time Complexity
Dr. Jicheng Fu
Department of Computer Science, University of Central Oklahoma
Objectives (Section 7.6)
The concepts of space complexity and time complexity
Using step counts to derive a function of the time complexity of a program
Asymptotics and orders of magnitude: the big-O and related notations
Time complexity of recursive algorithms
Motivation
[Figure: a sorted array arr of n elements: 3 5 10 16 20 … 22 28 36 60]
Evaluate An Algorithm
Two important measures to evaluate an algorithm: space complexity and time complexity.
Space complexity: the maximum storage space needed by an algorithm, expressed as a function of the problem size. Relatively easy to evaluate.
Time complexity: determining the number of steps (operations) needed as a function of the problem size. Our focus.
Step Count
Count the exact number of steps needed by an algorithm as a function of the problem size.
Each atomic operation is counted as one step: arithmetic operations, comparison operations, and other operations, such as assignment and return.
Algorithm 1

    int count_1(int n)
    {
        sum = 0
        for i = 1 to n {
            for j = i to n {
                sum++
            }
        }
        return sum
    }

The running time is
2 + Σ_{i=1}^{n} [2 + 3(n + 1 − i)] = 2 + 2n + 3 Σ_{i=1}^{n} (n + 1 − i) = 2 + 2n + 3n(n+1)/2 = (3/2)n² + (7/2)n + 2

Note: Σ_{i=1}^{n} (n + 1 − i) = Σ_{i=1}^{n} i = n(n+1)/2

Algorithm 2

    int count_2(int n)
    {
        sum = 0
        for i = 1 to n {
            sum += n + 1 - i
        }
        return sum
    }

The running time is 5n + 2.

Algorithm 3

    int count_3(int n)
    {
        sum = n(n+1)/2
        return sum
    }

The running time is 5 time units, independent of n.
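As a sanity check on the step counts, the three algorithms can be written as compilable C++ (a sketch; the slides use pseudocode, and the step counts above refer to that pseudocode, not to this exact translation). All three return Σ_{i=1}^{n} (n + 1 − i) = n(n+1)/2.

```cpp
#include <cassert>

// Algorithm 1: nested loops, (3/2)n^2 + (7/2)n + 2 steps.
int count_1(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        for (int j = i; j <= n; j++)
            sum++;
    return sum;
}

// Algorithm 2: single loop, 5n + 2 steps.
int count_2(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        sum += n + 1 - i;
    return sum;
}

// Algorithm 3: closed form, 5 steps regardless of n.
int count_3(int n) {
    return n * (n + 1) / 2;
}
```

All three agree on every input, yet their running times differ by an order of magnitude: the same problem, solved by a change of method.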
Asymptotics
An exact step count is usually unnecessary:
It is too dependent on programming languages and the programmer's style
These details make little difference in whether the algorithm is feasible or not
A change in fundamental method, however, can make a vital difference:
If the number of operations is proportional to n, then doubling n will double the running time
If the number of operations is proportional to 2ⁿ, then doubling n will square the number of operations
Example: assume that a computation that takes 1 second involves 10⁶ operations, and that doubling the problem size will require 10¹² operations. This increases the running time from 1 second to about 11.5 days:
10¹² operations / 10⁶ operations per second = 10⁶ seconds ≈ 11.5 days
Instead of an exact step count, we want a notation that accurately reflects the increase of computation time with the size, but ignores details that have little effect on the total.
Asymptotics: the study of functions of a parameter n, as n becomes larger and larger without bound.
Orders of Magnitude
The idea: suppose a function f(n) measures the amount of work done by an algorithm on a problem of size n. Compare f(n), for large values of n, with some well-known function g(n) whose behavior we already understand.
To compare f(n) against g(n): take the quotient f(n)/g(n), and take the limit of the quotient as n increases without bound.
Definition
If lim_{n→∞} f(n)/g(n) = 0, then f(n) has strictly smaller order of magnitude than g(n).
If lim_{n→∞} f(n)/g(n) is finite and nonzero, then f(n) has the same order of magnitude as g(n).
If lim_{n→∞} f(n)/g(n) = ∞, then f(n) has strictly greater order of magnitude than g(n).
Common choices for g(n):
g(n) = 1: constant function
g(n) = log n: logarithmic function
g(n) = n: linear function
g(n) = n²: quadratic function
g(n) = n³: cubic function
g(n) = 2ⁿ: exponential function
Notes:
The second case, when f(n) and g(n) have the same order of magnitude, includes all values of the limit except 0 and ∞.
Changing the running time of an algorithm by any nonzero constant factor will not affect its order of magnitude.
Polynomials
If f(n) is a polynomial in n with degree r, then f(n) has the same order of magnitude as nʳ.
If r < s, then nʳ has strictly smaller order of magnitude than nˢ.
Example 1: f(n) = 3n² − 100n − 25, g(n) = n³
lim_{n→∞} f(n)/g(n) = lim_{n→∞} (3n² − 100n − 25)/n³ = 0
so 3n² − 100n − 25 has strictly smaller order than n³.
Example 2: f(n) = 3n² − 100n − 25, g(n) = n
lim_{n→∞} f(n)/g(n) = lim_{n→∞} (3n² − 100n − 25)/n = ∞
so 3n² − 100n − 25 has strictly greater order than n.

Example 3: f(n) = 3n² − 100n − 25, g(n) = n²
lim_{n→∞} f(n)/g(n) = lim_{n→∞} (3n² − 100n − 25)/n² = 3
so 3n² − 100n − 25 has the same order as n².
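The three limits can also be checked numerically. A sketch in C++ (illustrative only: evaluating the quotient at one large n suggests, but does not prove, the limit):

```cpp
#include <cassert>
#include <cmath>

// f(n) = 3n^2 - 100n - 25, the function from Examples 1-3.
double f(double n) { return 3 * n * n - 100 * n - 25; }

// Quotient f(n)/g(n) for each choice of g(n).
double q_cubic(double n)     { return f(n) / (n * n * n); }  // tends to 0
double q_linear(double n)    { return f(n) / n; }            // tends to infinity
double q_quadratic(double n) { return f(n) / (n * n); }      // tends to 3
```

At n = 10⁶ the three quotients are already close to their limits: near 0, very large, and near 3, respectively.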
Logarithms
The order of magnitude of a logarithm does not depend on the base. Let log_a n and log_b n be logarithms to two different bases a > 1 and b > 1. Then
lim_{n→∞} (log_b n)/(log_a n) = lim_{n→∞} (ln n / ln b)/(ln n / ln a) = ln a / ln b = log_b a
a nonzero constant, so log_a n and log_b n have the same order of magnitude.
Since the base for logarithms makes no difference to the order of magnitude, we generally just write log without a base.
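A quick numerical illustration of this fact, with the bases 2 and 10 chosen arbitrarily: the ratio log₁₀ n / log₂ n is the constant log₁₀ 2 ≈ 0.30103 for every n.

```cpp
#include <cassert>
#include <cmath>

// Ratio of the same logarithm in two different bases: a constant, not a
// function of n.
double ratio(double n) {
    return std::log10(n) / std::log2(n);   // = log_10(2) for all n > 1
}
```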
Compare the order of magnitude of a logarithm log n with a power nʳ of n (r > 0). It is difficult to calculate the limit of the quotient log n / nʳ directly; we need a mathematical tool.
L’Hôpital’s Rule
Suppose that:
f(n) and g(n) are differentiable functions for all sufficiently large n, with derivatives f′(n) and g′(n), respectively
lim_{n→∞} f(n) = ∞ and lim_{n→∞} g(n) = ∞
lim_{n→∞} f′(n)/g′(n) exists
Then lim_{n→∞} f(n)/g(n) exists and
lim_{n→∞} f(n)/g(n) = lim_{n→∞} f′(n)/g′(n)
Use L’Hôpital’s Rule
Let f(n) = ln n and g(n) = nʳ, r > 0. Then
lim_{n→∞} (ln n)/nʳ = lim_{n→∞} (1/n)/(r n^{r−1}) = lim_{n→∞} 1/(r nʳ) = 0
Conclusion: log n has strictly smaller order of magnitude than any positive power nʳ of n, r > 0.
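The conclusion can be illustrated numerically: even for an exponent as small as r = 0.1, the quotient ln n / nʳ shrinks toward 0 as n grows (a sketch, not a proof):

```cpp
#include <cassert>
#include <cmath>

// ln(n) / n^r with the small exponent r = 0.1; any r > 0 works in the limit.
double quotient(double n) {
    return std::log(n) / std::pow(n, 0.1);
}
```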
Exponential Functions
Compare the order of magnitude of an exponential function aⁿ with a power nʳ of n (r > 0).
Using L’Hôpital’s Rule repeatedly (pp. 308), we reach the conclusion:
Any exponential function aⁿ, for any real number a > 1, has strictly greater order of magnitude than any power nʳ of n, for any positive integer r.
Compare the order of magnitude of two exponential functions with different bases, aⁿ and bⁿ. Assume 0 ≤ a < b. Then
lim_{n→∞} aⁿ/bⁿ = lim_{n→∞} (a/b)ⁿ = 0
Conclusion: if 0 ≤ a < b, then aⁿ has strictly smaller order of magnitude than bⁿ.
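A numeric sketch of the second conclusion, with the bases a = 2 and b = 3 chosen arbitrarily: (2/3)ⁿ collapses toward 0 very quickly.

```cpp
#include <cassert>
#include <cmath>

// (a/b)^n with a = 2, b = 3: tends to 0 since 0 <= a < b.
double q_exp(int n) {
    return std::pow(2.0 / 3.0, n);
}
```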
Common Orders
For most algorithm analyses, only a short list of functions is needed:
1 (constant), log n (logarithmic), n (linear), n² (quadratic), n³ (cubic), 2ⁿ (exponential)
They are in strictly increasing order of magnitude.
One more important function: n log n (see pp. 309)
The order of some advanced sorting algorithms
n log n has strictly greater order of magnitude than n
n log n has strictly smaller order of magnitude than any power nʳ for any r > 1
Growth Rate of Common Functions
The Big-O and Related Notations
In terms of the limit of f(n)/g(n) as n → ∞:
f(n) is o(g(n)) if f(n) has strictly smaller order of magnitude than g(n)
f(n) is O(g(n)) if f(n) has strictly smaller or the same order of magnitude as g(n)
f(n) is Θ(g(n)) if f(n) has the same order of magnitude as g(n)
f(n) is Ω(g(n)) if f(n) has strictly greater or the same order of magnitude as g(n)
These notations are pronounced “little oh”, “big Oh”, “big Theta”, and “big Omega”, respectively.
Examples:
On a list of length n, sequential search has running time Θ(n)
On an ordered list of length n, binary search has running time Θ(log n)
Retrieval from a contiguous list of length n has running time O(1)
Retrieval from a linked list of length n has running time O(n)
Any algorithm that uses comparisons of keys to search a list of length n must make Ω(log n) comparisons of keys
If f(n) is a polynomial in n of degree r, then f(n) is Θ(nʳ)
If r < s, then nʳ is o(nˢ)
If a > 1 and b > 1, then log_a n is Θ(log_b n)
log n is o(nʳ) for any r > 0
For any real number a > 1 and any positive integer r, nʳ is o(aⁿ)
If 0 ≤ a < b, then aⁿ is o(bⁿ)
Algorithm 4

    int count_0(int n)
    {
        sum = 0
        for i = 1 to n {
            for j = 1 to n {
                if i <= j then
                    sum++
            }
        }
        return sum
    }

The initialization and the return are O(1). The body of the inner loop is O(1) and executes n times for each of the n values of i, so the nested loops are O(n²). The running time is O(n²).
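A compilable C++ version of Algorithm 4 (a sketch of the pseudocode above): both loops always run n times, so the running time is O(n²), even though the function returns the same value n(n+1)/2 as Algorithms 1-3.

```cpp
#include <cassert>

// Algorithm 4: counts the pairs (i, j) with 1 <= i <= j <= n, of which
// there are n(n+1)/2, by scanning all n*n pairs.
int count_0(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            if (i <= j)
                sum++;
    return sum;
}
```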
Summary of Running Times

Algorithm      Running Time             Order of Running Time
Algorithm 1    (3/2)n² + (7/2)n + 2     n²
Algorithm 2    5n + 2                   n
Algorithm 3    5                        Constant
Asymptotic Running Times

Algorithm      Running Time             Asymptotic Bound
Algorithm 1    (3/2)n² + (7/2)n + 2     O(n²)
Algorithm 2    5n + 2                   O(n)
Algorithm 3    5                        O(1)
Algorithm 4    -                        O(n²)
More Examples
1)
int x = 0;
for (int i = 0; i < 100; i++)
x += i;
2)
int x = 0;
for (int i = 0; i < n * n; i++)
x += i;
* Assume that the value of n is the size of the problem
3)
int x = 0;
for (int i = 1; i < n; i *= 2)
x += i;
4)
int x = 0;
for (int i = 1; i < n; i++)
for (int j = 1; j < i; j++)
x += i + j;
5)
int x = 0;
for (int i = 1; i < n; i++)
for (int j = i; j < 100; j++)
x += i + j;
6)
int x = 0;
for (int i = 1; i < n; i++)
for (int j = n; j > i; j /= 3)
x += i + j;
7)
int x = 0;
for (int i = 1; i < n * n; i++)
for (int j = 1; j < i; j++)
x += i + j;
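One way to study such loops is to count their iterations empirically. A sketch for example 3 (the i *= 2 loop): doubling n adds roughly one iteration, the signature of logarithmic growth.

```cpp
#include <cassert>

// Iteration count for example 3: i takes the values 1, 2, 4, ..., so the
// loop body runs floor(log2(n-1)) + 1 times for n > 1.
int steps_example3(int n) {
    int steps = 0;
    for (int i = 1; i < n; i *= 2)
        steps++;
    return steps;
}
```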
Review: Arithmetic Sequences/Progressions
An arithmetic sequence is a sequence of numbers such that the difference of any two successive members of the sequence is a constant.
If the first term of an arithmetic sequence is a₁ and the common difference of successive members is d, then the nth term aₙ of the sequence is:
aₙ = a₁ + (n − 1)d
Analyzing Recursive Algorithms
Often a recurrence equation is used as the starting point to analyze a recursive algorithm.
In the recurrence equation, T(n) denotes the running time of the recursive algorithm for an input of size n.
We try to convert the recurrence equation into a closed-form equation to better understand the time complexity.
Closed form: no reference to T(n) on the right side of the equation.
Conversion to the closed-form solution can be very challenging.
Example: Factorial

    int factorial(int n)
    /* Pre:  n is an integer no less than 0.
       Post: The factorial of n (n!) is returned.
       Uses: The function factorial recursively. */
    {
        if (n == 0) return 1;
        else return n * factorial(n - 1);
    }

The time complexity of factorial(n) is:
T(n) = 2 if n = 0
T(n) = T(n − 1) + 3 + 1 if n > 0   (3 + 1: the comparison is included)
T(n) is an arithmetic sequence with common difference d = 4 of successive members and T(0) = 2, so
T(n) = T(0) + nd = 2 + 4n
The time complexity of factorial is O(n).
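The closed form can be checked by evaluating the recurrence T(0) = 2, T(n) = T(n − 1) + 4 directly (a sketch; T_fact is an illustrative name for the recurrence, not a real timing):

```cpp
#include <cassert>

// The recurrence T(0) = 2, T(n) = T(n-1) + 4, evaluated recursively.
// The arithmetic-sequence argument gives the closed form T(n) = 4n + 2.
int T_fact(int n) {
    return (n == 0) ? 2 : T_fact(n - 1) + 4;
}
```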
Recurrence Equations Examples
Divide and conquer: recursive merge sorting

    template <class Record>
    void Sortable_list<Record> :: recursive_merge_sort(int low, int high)
    /* Post: The entries of the sortable list between index low and high
             have been rearranged so that their keys are sorted into
             non-decreasing order.
       Uses: The contiguous List. */
    {
        if (high > low) {
            recursive_merge_sort(low, (high + low) / 2);
            recursive_merge_sort((high + low) / 2 + 1, high);
            merge(low, high);
        }
    }

The time complexity of recursive_merge_sort is:
T(n) = 1 if n ≤ 1
T(n) = T(n/2) + T(n/2) + cn if n > 1

To obtain a closed-form equation for T(n), we assume n is a power of 2:
T(n) = 2T(n/2) + cn
     = 2(2T(n/2²) + cn/2) + cn = 2²T(n/2²) + 2cn
     = 2³T(n/2³) + 3cn
     = …
     = 2ⁱT(n/2ⁱ) + icn

When i = log₂ n, we have:
T(n) = 2^{log n} T(n/2^{log n}) + cn log n = nT(1) + cn log n = n + cn log n

The time complexity is O(n log n).
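Similarly, the merge-sort recurrence can be unrolled by direct evaluation for powers of 2. Taking c = 1 for simplicity, the closed form derived above becomes T(n) = n + n log₂ n (a sketch; T_merge is an illustrative name, not a real timing):

```cpp
#include <cassert>

// The recurrence T(1) = 1, T(n) = 2 T(n/2) + c n with c = 1, for n a
// power of 2. Closed form: T(n) = n + n * log2(n).
long long T_merge(long long n) {
    if (n <= 1) return 1;
    return 2 * T_merge(n / 2) + n;   // c = 1
}
```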
Fibonacci numbers

    int fibonacci(int n)
    /* fibonacci: recursive version */
    {
        if (n <= 0) return 0;
        else if (n == 1) return 1;
        else return fibonacci(n - 1) + fibonacci(n - 2);
    }

The time complexity of fibonacci is:
T(n) = 2 if n = 0
T(n) = 3 if n = 1
T(n) = T(n − 1) + T(n − 2) + 6 if n > 1

Theorem (in Section A.4): If F(n) is defined by a Fibonacci sequence, then F(n) is Θ(gⁿ), where g = (1 + √5)/2.
The time complexity is exponential: O(gⁿ).
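The exponential growth can be observed by counting the calls the recursive fibonacci makes. The counter below (calls_fib, an illustrative name) satisfies a recurrence of the same shape as T(n), so ratios of consecutive counts approach g = (1 + √5)/2 ≈ 1.618 (a sketch):

```cpp
#include <cassert>

// Number of invocations made by the recursive fibonacci for input n,
// counting every call, including the base cases.
long long calls_fib(int n) {
    if (n <= 1) return 1;                        // one call, no recursion
    return 1 + calls_fib(n - 1) + calls_fib(n - 2);
}
```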