
Amdahl’s and Gustafson’s laws

Jan Zapletal

VŠB - Technical University of Ostrava


jan.zapletal@vsb.cz

November 23, 2009

1 Contents

2 Introduction

3 Amdahl’s law

4 Gustafson’s law

5 Equivalence of laws

6 References

Performance analysis
How does parallelization improve the performance of our program?

Metrics used to describe the performance:

- execution time,
- speedup,
- efficiency,
- cost, ...

Metrics

Execution time
- The time elapsed from when the first processor starts the execution to
  when the last processor completes it.
- On a parallel system it consists of computation time, communication
  time and idle time.

Speedup
- Defined as

      S = \frac{T_1}{T_p},

  where T_1 is the execution time on a sequential system and T_p on the
  parallel system.
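
A minimal C sketch of these metrics (efficiency is the standard speedup-per-processor ratio); the function names and the timings in main are illustrative, not measurements from any real system:

#include <stdio.h>

/* Speedup: sequential execution time divided by parallel execution time. */
double speedup(double t_seq, double t_par) {
    return t_seq / t_par;
}

/* Efficiency (standard definition): speedup per processor. */
double efficiency(double t_seq, double t_par, int n_procs) {
    return speedup(t_seq, t_par) / n_procs;
}

int main(void) {
    /* Hypothetical measurements: 100 s sequentially, 30 s on 4 processors. */
    double t1 = 100.0, tp = 30.0;
    int n = 4;
    printf("speedup    = %.2f\n", speedup(t1, tp));       /* 3.33 */
    printf("efficiency = %.2f\n", efficiency(t1, tp, n)); /* 0.83 */
    return 0;
}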

Amdahl’s law

Gene Myron Amdahl (born November 16, 1922)
- worked for IBM,
- best known for formulating Amdahl’s law, uncovering the limits of
  parallel computing.

Let T_1 denote the computation time on a sequential system. We can split
the total time as follows:

      T_1 = t_s + t_p,

where
- t_s - computation time needed for the sequential part,
- t_p - computation time needed for the parallel part.

Clearly, if we parallelize the problem, only t_p can be reduced. Assuming
ideal parallelization we get

      T_p = t_s + \frac{t_p}{N},

where
- N - number of processors.
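
A small C sketch of this ideal model, assuming a purely hypothetical split of the work (5 s sequential, 95 s parallelizable):

#include <stdio.h>

/* Ideal parallel execution time: the sequential part t_s is untouched,
 * the parallelizable part t_p is divided evenly among N processors. */
double parallel_time(double t_s, double t_p, int n) {
    return t_s + t_p / n;
}

int main(void) {
    double t_s = 5.0, t_p = 95.0;   /* hypothetical split of T_1 = 100 s */
    for (int n = 1; n <= 1024; n *= 4)
        printf("N = %4d   T_p = %7.2f s\n", n, parallel_time(t_s, t_p, n));
    /* T_p approaches t_s = 5 s no matter how many processors are added. */
    return 0;
}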
Amdahl’s law

Thus we get a speedup of

      S = \frac{T_1}{T_p} = \frac{t_s + t_p}{t_s + \frac{t_p}{N}}.

Let f denote the sequential portion of the computation, i.e.

      f = \frac{t_s}{t_s + t_p}.

Thus the speedup formula can be simplified to

      S = \frac{1}{f + \frac{1-f}{N}} < \frac{1}{f}.

- Notice that Amdahl assumes the problem size does not change with
  the number of CPUs.
- The goal is to solve a fixed-size problem as quickly as possible.
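
A corresponding C sketch of Amdahl’s formula; the sequential fraction f = 0.05 is only an illustrative assumption:

#include <stdio.h>

/* Amdahl's law: speedup with sequential fraction f on N processors. */
double amdahl_speedup(double f, int n) {
    return 1.0 / (f + (1.0 - f) / n);
}

int main(void) {
    double f = 0.05;                /* assume 5% of the work is sequential */
    for (int n = 1; n <= 4096; n *= 8)
        printf("N = %4d   S = %6.2f\n", n, amdahl_speedup(f, n));
    printf("upper bound 1/f = %.2f\n", 1.0 / f);  /* 20, however large N gets */
    return 0;
}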
Amdahl’s law

[figure]
Gustafson’s law

John L. Gustafson (born January 19, 1955)
- American computer scientist and businessman,
- found out that practical problems show much better speedup than
  Amdahl predicted.

Gustafson’s law
- The computation time is constant (instead of the problem size),
- increasing the number of CPUs ⇒ solve a bigger problem and get better
  results in the same time.

Let T_p denote the computation time on a parallel system. We can split the
total time as follows:

      T_p = t_s^* + t_p^*,

where
- t_s^* - computation time needed for the sequential part,
- t_p^* - computation time needed for the parallel part.
Gustafson’s law

On a sequential system we would get

      T_1 = t_s^* + N \cdot t_p^*.

Thus the speedup will be

      S = \frac{t_s^* + N \cdot t_p^*}{t_s^* + t_p^*}.

Let f^* denote the sequential portion of the computation on the parallel
system, i.e.

      f^* = \frac{t_s^*}{t_s^* + t_p^*}.

Then

      S = f^* + N \cdot (1 - f^*).
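
A matching C sketch of Gustafson’s scaled speedup; f^* = 0.05 is again only an illustrative assumption:

#include <stdio.h>

/* Gustafson's law: scaled speedup with sequential fraction f_star
 * measured on the parallel system with N processors. */
double gustafson_speedup(double f_star, int n) {
    return f_star + n * (1.0 - f_star);
}

int main(void) {
    double f_star = 0.05;           /* assume 5% sequential on the parallel run */
    for (int n = 1; n <= 4096; n *= 8)
        printf("N = %4d   S = %8.2f\n", n, gustafson_speedup(f_star, n));
    /* The speedup grows almost linearly with N: S = 0.95 * N + 0.05. */
    return 0;
}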

Gustafson’s law

[figure]
What the hell?!

- The bigger the problem, the smaller f - the serial part usually remains the
  same,
- and f ≠ f^*.

Amdahl’s law says:

      S = \frac{t_s + t_p}{t_s + \frac{t_p}{N}}.

Let now f^* denote the sequential portion spent in the parallel
computation, i.e.

      f^* = \frac{t_s}{t_s + \frac{t_p}{N}}   and   (1 - f^*) = \frac{t_p / N}{t_s + \frac{t_p}{N}}.

Hence

      t_s = f^* \cdot \left( t_s + \frac{t_p}{N} \right)   and   t_p = N \cdot (1 - f^*) \cdot \left( t_s + \frac{t_p}{N} \right).
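
A numeric illustration (with the same hypothetical 5 s / 95 s split as before) of how the fraction f^* measured on the parallel run differs from Amdahl’s fixed fraction f:

#include <stdio.h>

int main(void) {
    /* Hypothetical fixed-size problem: 5 s sequential, 95 s parallelizable. */
    double t_s = 5.0, t_p = 95.0;
    double f = t_s / (t_s + t_p);          /* Amdahl's fraction: 0.05, independent of N */
    printf("f  = %.4f\n", f);
    for (int n = 1; n <= 256; n *= 4) {
        double t_par  = t_s + t_p / n;     /* parallel execution time */
        double f_star = t_s / t_par;       /* fraction measured on the parallel run */
        printf("N = %3d   f* = %.4f\n", n, f_star);
    }
    /* For a fixed-size problem f stays at 0.05 while f* grows with N,
     * which is why the two fractions must not be mixed up. */
    return 0;
}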

I see!

- After substituting t_s and t_p into Amdahl’s formula one gets

      S = \frac{t_s + t_p}{t_s + \frac{t_p}{N}} = f^* + N \cdot (1 - f^*),

  which is exactly what Gustafson derived.

- The key is not to mix up the values f and f^* - this caused great
  confusion that lasted for years!
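
A quick numeric cross-check (same hypothetical timings as above) that the two formulas give identical speedups once f^* is computed consistently:

#include <stdio.h>

int main(void) {
    double t_s = 5.0, t_p = 95.0;   /* hypothetical fixed-size problem */
    int n = 16;
    double t_par = t_s + t_p / n;

    double s_amdahl    = (t_s + t_p) / t_par;            /* Amdahl's form */
    double f_star      = t_s / t_par;                    /* fraction on the parallel run */
    double s_gustafson = f_star + n * (1.0 - f_star);    /* Gustafson's form */

    printf("Amdahl:    S = %.4f\n", s_amdahl);           /* 9.1429 */
    printf("Gustafson: S = %.4f\n", s_gustafson);        /* 9.1429 -- identical */
    return 0;
}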

References

QUINN, Michael Jay. Parallel Programming in C with MPI and
OpenMP. New York: McGraw-Hill, 2004. 507 pp.

Amdahl’s law [online]. Available at:
<http://en.wikipedia.org/wiki/Amdahl's_law>.

Gustafson’s law [online]. Available at:
<http://en.wikipedia.org/wiki/Gustafson's_law>.

Thank you for your attention!
