
Amdahl’s and Gustafson’s laws

Jan Zapletal

VSB - Technical University of Ostrava
jan.zapletal@vsb.cz

November 23, 2009

Contents

- Introduction
- Amdahl’s law
- Gustafson’s law
- Equivalence of laws
- References

Performance analysis

How does parallelization improve the performance of our program?

Metrics used to describe the performance:

- execution time,
- speedup,
- efficiency,
- cost...

Metrics

Execution time

- The time elapsed from when the first processor starts the execution to when the last processor completes it.
- On a parallel system it consists of computation time, communication time and idle time.

Speedup

- Defined as

  S = \frac{T_1}{T_p},

  where T_1 is the execution time on the sequential system and T_p on the parallel system.
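A minimal sketch of these two metrics in Python, using made-up timings; the efficiency formula E = S/N is the usual definition and is not spelled out on the slide itself.

```python
# A minimal sketch with hypothetical timings: speedup and efficiency
# computed from measured execution times.

def speedup(t_serial, t_parallel):
    """S = T_1 / T_p."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Usual definition E = S / N (an assumption, not stated on the slide)."""
    return speedup(t_serial, t_parallel) / n_procs

T1, Tp, N = 120.0, 20.0, 8          # hypothetical measurements, in seconds
print(f"speedup    S = {speedup(T1, Tp):.2f}")        # 6.00
print(f"efficiency E = {efficiency(T1, Tp, N):.2f}")  # 0.75
```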

Amdahl’s law

Gene Myron Amdahl (born November 16, 1922)

- worked for IBM,
- best known for formulating Amdahl’s law, uncovering the limits of parallel computing.

Let T_1 denote the computation time on a sequential system. We can split the total time as follows:

T_1 = t_s + t_p,

where

- t_s - computation time needed for the sequential part,
- t_p - computation time needed for the parallel part.

Clearly, if we parallelize the problem, only t_p can be reduced. Assuming ideal parallelization we get

T_p = t_s + \frac{t_p}{N},

where

- N - number of processors.
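A quick numerical look at this time model (the split t_s = 10 s, t_p = 90 s is hypothetical): only the parallel part shrinks as N grows, so T_p levels off at t_s.

```python
# Ideal-parallelization time model: Tp = t_s + t_p / N.
# The split of T1 = 100 s into t_s = 10 s and t_p = 90 s is hypothetical.

def parallel_time(t_s, t_p, n):
    return t_s + t_p / n

t_s, t_p = 10.0, 90.0
for n in (1, 2, 4, 8, 16, 128):
    print(f"N = {n:>3}: Tp = {parallel_time(t_s, t_p, n):6.2f} s")
# Tp approaches t_s = 10 s; no number of processors removes the sequential part.
```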

Amdahl’s law

Thus we get a speedup of

S = \frac{T_1}{T_p} = \frac{t_s + t_p}{t_s + \frac{t_p}{N}}.

Let f denote the sequential portion of the computation, i.e.

f = \frac{t_s}{t_s + t_p}.

Thus the speedup formula can be simplified to

S = \frac{1}{f + \frac{1-f}{N}} < \frac{1}{f}.

- Notice that Amdahl assumes the problem size does not change with the number of CPUs.
- He wants to solve a fixed-size problem as quickly as possible.
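A minimal sketch of Amdahl’s formula (the sequential fraction f = 0.05 is illustrative), showing how the speedup saturates below the bound 1/f:

```python
# Amdahl's law: S = 1 / (f + (1 - f) / N), bounded above by 1 / f.

def amdahl_speedup(f, n):
    return 1.0 / (f + (1.0 - f) / n)

f = 0.05                              # illustrative: 5 % of the work is sequential
for n in (1, 2, 8, 64, 1024, 4096):
    print(f"N = {n:>5}: S = {amdahl_speedup(f, n):6.2f}")
print(f"upper bound 1/f = {1.0 / f:.1f}")   # 20.0, approached but never reached
```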


Gustafson’s law

John L. Gustafson (born January 19, 1955)

- American computer scientist and businessman,
- found out that practical problems show much better speedup than Amdahl predicted.

Gustafson’s law

- The computation time is held constant (instead of the problem size),
- increasing the number of CPUs ⇒ solve a bigger problem and get better results in the same time.

Let T_p denote the computation time on a parallel system. We can split the total time as follows:

T_p = t_s^* + t_p^*,

where

- t_s^* - computation time needed for the sequential part,
- t_p^* - computation time needed for the parallel part.

Gustafson’s law

On a sequential system we would get

T_1 = t_s^* + N \cdot t_p^*.

Thus the speedup will be

S = \frac{t_s^* + N \cdot t_p^*}{t_s^* + t_p^*}.

Let f^* denote the sequential portion of the computation on the parallel system, i.e.

f^* = \frac{t_s^*}{t_s^* + t_p^*}.

Then

S = f^* + N \cdot (1 - f^*).
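A minimal sketch of Gustafson’s scaled speedup (f^* = 0.05 is again illustrative); in contrast to Amdahl’s bound, it grows without limit as N grows:

```python
# Gustafson's law: S = f* + N * (1 - f*); the scaled speedup grows linearly in N.

def gustafson_speedup(f_star, n):
    return f_star + n * (1.0 - f_star)

f_star = 0.05                 # illustrative: 5 % of the parallel run is sequential
for n in (1, 2, 8, 64, 1024):
    print(f"N = {n:>5}: S = {gustafson_speedup(f_star, n):8.2f}")
# No fixed upper bound: the problem is scaled up with N, so useful work keeps growing.
```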


What the hell?!

- The bigger the problem, the smaller f - the serial part usually remains the same,
- and f ≠ f^*.

Amdahl’s law says:

S = \frac{t_s + t_p}{t_s + \frac{t_p}{N}}.

Now let f^* denote the sequential portion spent in the parallel computation, i.e.

f^* = \frac{t_s}{t_s + \frac{t_p}{N}} and (1 - f^*) = \frac{t_p / N}{t_s + \frac{t_p}{N}}.

Hence

t_s = f^* \cdot \left( t_s + \frac{t_p}{N} \right) and t_p = N \cdot (1 - f^*) \cdot \left( t_s + \frac{t_p}{N} \right).

I see!

- After substituting t_s and t_p into Amdahl’s formula one gets

  S = \frac{t_s + t_p}{t_s + \frac{t_p}{N}} = f^* + N \cdot (1 - f^*),

  which is exactly what Gustafson derived.

- The key is not to mix up the values f and f^* - this mix-up caused confusion that lasted for years!
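A numerical cross-check of the equivalence, with hypothetical timings t_s = 10 s, t_p = 90 s and N = 8: computing f from the serial run and f^* from the parallel run of the same problem, Amdahl’s and Gustafson’s formulas yield the same speedup.

```python
# Cross-check: with f and f* taken from the same run, both laws agree.
# The timings are hypothetical.

def check_equivalence(t_s, t_p, n):
    T1 = t_s + t_p                      # sequential execution time
    Tp = t_s + t_p / n                  # ideal parallel execution time
    f      = t_s / (t_s + t_p)          # sequential fraction of the serial run
    f_star = t_s / (t_s + t_p / n)      # sequential fraction of the parallel run
    s_amdahl    = 1.0 / (f + (1.0 - f) / n)
    s_gustafson = f_star + n * (1.0 - f_star)
    s_measured  = T1 / Tp
    return s_amdahl, s_gustafson, s_measured

print(check_equivalence(t_s=10.0, t_p=90.0, n=8))
# All three numbers coincide (about 4.71): the two laws describe the same
# speedup, just parametrized by different sequential fractions (f vs. f*).
```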

References

QUINN, Michael Jay. Parallel Programming in C with MPI and OpenMP. New York: McGraw-Hill, 2004. 507 p.

Amdahl’s law [online]. Available at: <http://en.wikipedia.org/wiki/Amdahl's_law>.

Gustafson’s law [online]. Available at: <http://en.wikipedia.org/wiki/Gustafson's_law>.

Thank you for your attention!