complexityAnalysis.ppt


Transcript of complexityAnalysis.ppt


    Analysis of Algorithms

    An algorithm is a step-by-step procedure for solving a problem in a finite amount of time.


    Running Time

    Most algorithms transform input objects into output objects.
    The running time of an algorithm typically grows with the input size.
    Average-case time is often difficult to determine.
    We focus on the worst-case running time: it is easier to analyze, and it is crucial to applications such as games, finance, and robotics.

    [Chart: running time (0-120) versus input size (1000-4000), with best-case, average-case, and worst-case curves.]



    Experimental Studies

    Write a program implementing the algorithm.
    Run the program with inputs of varying size and composition.
    Use a function, like the built-in clock() function, to get an accurate measure of the actual running time.

    Plot the results.
    [Chart: measured time in ms (0-9000) versus input size (0-100).]
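    To make the experimental approach concrete, here is a minimal C++ sketch (not part of the original slides) that times a placeholder function someAlgorithm on inputs of increasing size using the clock() function mentioned above; the function, the input sizes, and the workload are assumptions for illustration.

```cpp
#include <cstdio>
#include <ctime>
#include <vector>

// Placeholder algorithm under test (assumed for this sketch): sums a vector.
long long someAlgorithm(const std::vector<int>& input) {
    long long sum = 0;
    for (int x : input) sum += x;
    return sum;
}

int main() {
    for (std::size_t n = 1000; n <= 64000; n *= 2) {
        std::vector<int> input(n, 1);        // input of size n
        std::clock_t start = std::clock();   // start the timer
        volatile long long result = someAlgorithm(input);
        std::clock_t stop = std::clock();    // stop the timer
        double ms = 1000.0 * static_cast<double>(stop - start) / CLOCKS_PER_SEC;
        std::printf("n = %zu  time = %.3f ms  (result = %lld)\n",
                    n, ms, static_cast<long long>(result));
    }
    return 0;
}
```

    Running such a program for each input size and plotting the measurements produces the kind of chart sketched above.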



    Limitations of Experiments


    It is necessary to implement the algorithm, which may be difficult.
    Results may not be indicative of the running time on other inputs not included in the experiment.
    In order to compare two algorithms, the same hardware and software environments must be used.


    Theoretical Analysis


    Uses a high-level description of the algorithm instead of an implementation.
    Characterizes running time as a function of the input size, n.
    Takes into account all possible inputs.
    Allows us to evaluate the speed of an algorithm independent of the hardware/software environment.



    Pseudocode Details


    Control flow:
        if ... then ... [else ...]
        while ... do ...
        repeat ... until ...
        for ... do ...
        indentation replaces braces
    Method declaration:
        Algorithm method(arg [, arg])
            Input ...
            Output ...
    Method/function call: var.method(arg [, arg])
    Return value: return expression
    Expressions:
        ← for assignment (like = in C++)
        = for equality testing (like == in C++)
        n² and other superscripts and mathematical formatting are allowed


    The Random Access Machine (RAM) Model

    A CPU.
    A potentially unbounded bank of memory cells, each of which can hold an arbitrary number or character.


    Memory cells are numbered, and accessing any cell in memory takes unit time.


    Primitive Operations


    Basic computations performed by an algorithm.
    Identifiable in pseudocode.
    Largely independent from the programming language.
    Exact definition not important (we will see why later).
    Assumed to take a constant amount of time in the RAM model.
    Examples:
        evaluating an expression
        assigning a value to a variable
        indexing into an array
        calling a method
        returning from a method
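    For concreteness (an illustration added here, not part of the original slides), the following tiny C++ function has its primitive operations labeled informally in comments:

```cpp
// Informal labeling of primitive operations; exact accounting conventions vary.
int addAt(int A[], int i, int x) {
    A[i] = A[i] + x;   // index into A, evaluate an expression, assign a value
    return A[i];       // index into A, return from the method
}
```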


    Counting Primitive Operations


    By inspecting the pseudocode, we can determine the maximum number of primitive operations executed by an algorithm, as a function of the input size.

    Algorithm arrayMax(A, n)              # operations
        currentMax ← A[0]                   2
        for i ← 1 to n - 1 do               2 + n
            if A[i] > currentMax then       2(n - 1)
                currentMax ← A[i]           2(n - 1)
            { increment counter i }         2(n - 1)
        return currentMax                   1
                                    Total   7n - 1
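    A direct C++ rendering of arrayMax (an illustrative translation, not code taken from the slides) makes the counted operations visible:

```cpp
#include <cassert>

// C++ translation of the arrayMax pseudocode above; the comments echo the
// worst-case per-line operation counts from the slide.
int arrayMax(const int A[], int n) {
    assert(n > 0);                    // precondition: at least one element
    int currentMax = A[0];            // 2 ops: index A[0], assign
    for (int i = 1; i < n; ++i) {     // 2 + n ops: initialize i, n loop tests
        if (A[i] > currentMax)        // 2(n - 1) ops: index, compare
            currentMax = A[i];        // 2(n - 1) ops in the worst case: index, assign
        /* increment counter i */     // 2(n - 1) ops: add, assign
    }
    return currentMax;                // 1 op: return
}                                     // worst-case total: 7n - 1
```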


    Running Time Calculation


    Rule 1 (for loops): the running time of a for loop is at most the running time of the statements inside the for loop (including the tests) times the number of iterations.
    Rule 2 (nested loops): analyze these inside out. The total running time of a statement inside a group of nested loops is the running time of the statement multiplied by the product of the sizes of all the loops.


    Running Time Calculation: Example


    Example 1:
        sum = 0;
        for (i = 1; i ...
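    The code above is cut off in the transcript. As a sketch of the kind of loops Rules 1 and 2 are applied to (an assumed reconstruction, not necessarily the original example), consider:

```cpp
// Rule 1: the O(1) body executes n times, so the loop takes O(n) time.
long long singleLoop(int n) {
    long long sum = 0;
    for (int i = 1; i <= n; ++i)
        sum += i;                      // constant work per iteration
    return sum;
}

// Rule 2: analyzed inside out, the O(1) statement executes n * n times,
// so the nested loops take O(n^2) time.
long long nestedLoops(int n) {
    long long count = 0;
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            ++count;
    return count;
}
```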


    Estimating Running Time


    Algorithm arrayMax executes 7n - 1 primitive operations in the worst case. Define:
        a = time taken by the fastest primitive operation
        b = time taken by the slowest primitive operation
    Let T(n) be the worst-case time of arrayMax. Then
        a(7n - 1) ≤ T(n) ≤ b(7n - 1)
    Hence, the running time T(n) is bounded by two linear functions.
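    To put rough numbers on this bound (the values of a and b below are assumptions for illustration, not from the slides):

```latex
% Suppose the fastest primitive operation takes a = 10 ns and the slowest
% takes b = 100 ns. For n = 1000, arrayMax performs 7n - 1 = 6999 primitive
% operations in the worst case, so
\[
  10\,\text{ns} \cdot 6999 \;\le\; T(1000) \;\le\; 100\,\text{ns} \cdot 6999,
\]
\[
  \text{i.e. } 69.99\,\mu\text{s} \;\le\; T(1000) \;\le\; 699.9\,\mu\text{s}.
\]
```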


    Growth Rate of Running Time


    Changing the hardware/software environment:
        affects T(n) by a constant factor, but
        does not alter the growth rate of T(n).
    The linear growth rate of the running time T(n) is an intrinsic property of algorithm arrayMax.


    Growth Rates

    Growth rates of functions:
        linear: n
        quadratic: n²
        cubic: n³
    In a log-log chart, the slope of the line corresponds to the growth rate of the function.

    [Log-log chart: T(n) versus n for the linear, quadratic, and cubic functions; each appears as a straight line whose slope reflects its growth rate.]


    Constant Factors


    The growth rate is not affected by constant factors or lower-order terms.
    Examples:
        10²n + 10⁵ is a linear function
        10⁵n² + 10⁸n is a quadratic function

    [Log-log chart: T(n) versus n; the two linear functions have the same slope, as do the two quadratic functions.]
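    A quick worked check (added here, not on the slide) of why the lower-order term does not matter for the linear example:

```latex
% For the linear example 10^2 n + 10^5: once n >= 10^3 the constant term is
% dominated by the leading term, since 10^5 <= 10^2 n. Hence
\[
  10^2 n + 10^5 \;\le\; 2 \cdot 10^2\, n \qquad \text{for all } n \ge 10^3,
\]
% so the function stays within a constant factor (2) of the plain linear
% function 10^2 n.
```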


    Big-Oh Notation


    Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n₀ such that f(n) ≤ c·g(n) for n ≥ n₀.
    Example: 2n + 10 is O(n):
        2n + 10 ≤ cn
        (c - 2)n ≥ 10
        n ≥ 10/(c - 2)
        pick c = 3 and n₀ = 10

    [Log-log chart: the functions n, 2n + 10, and 3n versus n (1 to 1,000).]


    Big-Oh Example


    Example: the function n² is not O(n):
        n² ≤ cn
        n ≤ c
    The above inequality cannot be satisfied, since c must be a constant.

    [Log-log chart: the functions n, 10n, 100n, and n² versus n (1 to 1,000).]



    More Big-Oh Examples

    7n - 2 is O(n):
        need c > 0 and n₀ ≥ 1 such that 7n - 2 ≤ cn for n ≥ n₀; this is true for c = 7 and n₀ = 1.
    3n³ + 20n² + 5 is O(n³):
        need c > 0 and n₀ ≥ 1 such that 3n³ + 20n² + 5 ≤ cn³ for n ≥ n₀; this is true for c = 4 and n₀ = 21.
    3 log n + log log n is O(log n):
        need c > 0 and n₀ ≥ 1 such that 3 log n + log log n ≤ c log n for n ≥ n₀; this is true for c = 4 and n₀ = 2.
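    As an informal sanity check (added here, not part of the slides), a short C++ program can test a claimed pair (c, n₀) over a finite range of n; this does not prove the bound, but it illustrates what the definition asks for, using the second example above:

```cpp
#include <cstdio>

int main() {
    // Check f(n) <= c * g(n) for n0 <= n <= 1,000,000, with
    // f(n) = 3n^3 + 20n^2 + 5, g(n) = n^3, c = 4, n0 = 21 (values from the slide).
    const long long c = 4, n0 = 21, nMax = 1000000;
    bool holds = true;
    for (long long n = n0; n <= nMax; ++n) {
        long double f = 3.0L * n * n * n + 20.0L * n * n + 5.0L;
        long double g = static_cast<long double>(n) * n * n;
        if (f > c * g) {
            std::printf("bound fails at n = %lld\n", n);
            holds = false;
            break;
        }
    }
    if (holds)
        std::printf("3n^3 + 20n^2 + 5 <= 4n^3 held for all tested n in [%lld, %lld]\n",
                    n0, nMax);
    return 0;
}
```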


    Big-Oh and Growth Rate


    The big-Oh notation gives an upper bound on the growth rate of a function.
    The statement "f(n) is O(g(n))" means that the growth rate of f(n) is no more than the growth rate of g(n).
    We can use the big-Oh notation to rank functions according to their growth rate.

                           f(n) is O(g(n))    g(n) is O(f(n))
        g(n) grows more    Yes                No
        f(n) grows more    No                 Yes
        Same growth        Yes                Yes



    Asymptotic Algorithm Analysis


    The asymptotic analysis of an algorithm determines the running time in big-Oh notation.
    To perform the asymptotic analysis:
        we find the worst-case number of primitive operations executed as a function of the input size;
        we express this function with big-Oh notation.
    Example: we determine that algorithm arrayMax executes at most 7n - 1 primitive operations;
        we say that algorithm arrayMax runs in O(n) time.
    Since constant factors and lower-order terms are eventually dropped anyhow, we can disregard them when counting primitive operations.


    Math you need to Review

    Summations
    Logarithms and exponents
        Properties of logarithms:
            log_b(xy) = log_b x + log_b y
            log_b(x/y) = log_b x - log_b y
            log_b(x^a) = a · log_b x
            log_b a = log_x a / log_x b
        Properties of exponentials:
            a^(b+c) = a^b · a^c
            a^(bc) = (a^b)^c
            a^b / a^c = a^(b-c)
            b = a^(log_a b)
            b^c = a^(c · log_a b)
    Proof techniques
    Basic probability



    Relatives of Big-Oh

    big-Omega:
        f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n₀ ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n₀.
    big-Theta:
        f(n) is Θ(g(n)) if there are constants c′ > 0 and c″ > 0 and an integer constant n₀ ≥ 1 such that c′·g(n) ≤ f(n) ≤ c″·g(n) for n ≥ n₀.
    little-oh:
        f(n) is o(g(n)) if, for any constant c > 0, there is an integer constant n₀ ≥ 0 such that f(n) ≤ c·g(n) for n ≥ n₀.
    little-omega:
        f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n₀ ≥ 0 such that f(n) ≥ c·g(n) for n ≥ n₀.


    Intuition for Asymptotic Notation


    Big-Oh: f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n).
    big-Omega: f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n).
    big-Theta: f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n).
    little-oh: f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n).
    little-omega: f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n).



    Example Uses of the Relatives of Big-Oh

    5n² is ω(n):
        f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n₀ ≥ 0 such that f(n) ≥ c·g(n) for n ≥ n₀.
        Need 5n₀² ≥ c·n₀; given c, the n₀ that satisfies this is n₀ ≥ c/5 ≥ 0.
    5n² is Ω(n):
        f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n₀ ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n₀.
        Let c = 1 and n₀ = 1.
    5n² is Ω(n²):
        f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n₀ ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n₀.
        Let c = 5 and n₀ = 1.
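    As a small follow-up (added here, not on the slide), the last claim can be combined with the matching upper bound to place 5n² in big-Theta:

```latex
% 5n^2 is also O(n^2) with c = 5 and n_0 = 1, since 5n^2 <= 5 * n^2 for n >= 1.
% Taking c' = c'' = 5 in the big-Theta definition above then gives
\[
  5n^2 \;\le\; 5n^2 \;\le\; 5n^2 \quad \text{for } n \ge 1
  \qquad\Longrightarrow\qquad
  5n^2 \text{ is } \Theta(n^2).
\]
```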