
Analysis of Algorithms

(pt 2)
(Chapter 4)
COMP53
Oct 3, 2007

Best, Worst and Average Case

For a particular problem size n, we can find:
  Best case: the input that can be solved the fastest
  Worst case: the input that will take the longest
  Average case: the average time over all inputs of the same size
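The slides do not tie these cases to a specific algorithm; as a hypothetical
illustration (linear_search is our own example, not from the slides), linear
search makes the three cases concrete:

```python
def linear_search(data, target):
    """Return the index of target in data, or -1 if absent."""
    for i, value in enumerate(data):
        if value == target:
            return i   # best case: target is data[0] -> 1 comparison
    return -1          # worst case: target absent -> n comparisons

# For inputs of size n:
#   best case:    target at index 0        (constant number of comparisons)
#   worst case:   target missing           (n comparisons)
#   average case: target equally likely at (about n/2 comparisons)
#                 any position
```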

Methods of Analysis

Experimental Studies: run experiments on implementations of
algorithms and record actual times or operation counts.

Theoretical Analysis: determine time or operation counts from a
mathematical analysis of the algorithm; doesn't require an
implementation (or even a computer).

Theoretical Analysis

  Uses a high-level description of the algorithm instead of an
  implementation
  Characterizes running time as a function of the input size, n
  Takes into account all possible inputs
  Allows us to evaluate the speed of an algorithm independent of the
  hardware/software environment

Seven Important Functions

Seven functions that often appear in algorithm analysis:

  Constant     1
  Logarithmic  log n
  Linear       n
  N-Log-N      n log n
  Quadratic    n^2
  Cubic        n^3
  Exponential  2^n

In a log-log chart, the slope of the line corresponds to the
growth rate of the function.
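As a quick illustration (this snippet is our own sketch, not part of the
slides), tabulating the seven functions at a few input sizes shows how
rapidly they separate:

```python
import math

functions = [
    ("constant",    lambda n: 1),
    ("logarithmic", lambda n: math.log2(n)),
    ("linear",      lambda n: n),
    ("n log n",     lambda n: n * math.log2(n)),
    ("quadratic",   lambda n: n ** 2),
    ("cubic",       lambda n: n ** 3),
    ("exponential", lambda n: 2 ** n),
]

for name, f in functions:
    # evaluate each growth function at n = 8, 16, 32
    row = ", ".join(f"{f(n):,.0f}" for n in (8, 16, 32))
    print(f"{name:>11}: {row}")
```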

Growth Functions

[chart: the seven growth functions plotted on a log-log scale]

Reasonable Time

Assume a machine performing 10^6 basic operations per second. The
number of operations it can complete in a given period is roughly:

  Minute   10^8 ops
  Hour     10^9 ops
  Day      10^11 ops
  Month    10^12 ops
  Year     10^13 ops

Clearly, any algorithm requiring more than 10^14 operations is
impractical.
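These entries are just 10^6 operations/second multiplied by the number of
seconds in each period, rounded to the nearest power of ten; a quick sketch
(our own) to verify:

```python
RATE = 10 ** 6  # assumed speed: 1e6 basic operations per second

SECONDS = {
    "minute": 60,
    "hour":   60 * 60,
    "day":    24 * 60 * 60,
    "month":  30 * 24 * 60 * 60,
    "year":   365 * 24 * 60 * 60,
}

for period, secs in SECONDS.items():
    # e.g. a day allows 8.64e10 operations, which the
    # slide rounds to 10^11
    print(f"{period:>6}: {RATE * secs:.2e} ops")
```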

Growth Functions

[chart: growth functions grouped by running time on the machine above:
finishing in under a day, under a month, or taking over a year]

Big-Oh Notation

f(n) is O(g(n)) if there are positive constants c and n0 such that

  f(n) ≤ c·g(n)  for all n ≥ n0

Example: 2n + 10 is O(n)
  2n + 10 ≤ cn  for c = 3 and n ≥ n0 = 10
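A small sanity check of the slide's witnesses c = 3 and n0 = 10 (our own
sketch, checking a finite range of n):

```python
def f(n): return 2 * n + 10   # the function being bounded
def g(n): return n            # the bounding function
c, n0 = 3, 10                 # witnesses from the slide

# the defining inequality must hold for every n >= n0
assert all(f(n) <= c * g(n) for n in range(n0, 100_000))
```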

Big-Oh Example

Example: n^2 is not O(n)
  There is no constant c such that n^2 ≤ cn for all n ≥ n0:
  the inequality would require c ≥ n, which cannot hold as n → ∞.

Comparing Growth Rates

f(n) is O(g(n)) means that the growth rate of f(n) is no more than
the growth rate of g(n); g(n) is an asymptotic upper bound on f(n).

Asymptotic Algorithm Analysis

The asymptotic analysis of an algorithm determines the running time
in big-Oh notation.

To perform the asymptotic analysis:
  We find the worst-case number of basic operations as a function of
  the input size
  We express this function with big-Oh notation

Example:
  We determine that algorithm arrayMax executes at most 8n − 2
  primitive operations
  We say that algorithm arrayMax runs in O(n) time
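The slides reference the textbook's arrayMax without showing it; a minimal
Python sketch of the standard algorithm follows (the 8n − 2 figure depends
on the textbook's accounting of primitive operations, not on this rendering):

```python
def array_max(X):
    """Return the maximum element of a nonempty array X."""
    current_max = X[0]
    for i in range(1, len(X)):   # loop body runs n - 1 times
        if X[i] > current_max:   # one comparison per iteration
            current_max = X[i]   # assignment on a new maximum
    return current_max

# A constant number of primitive operations per iteration,
# times n - 1 iterations, gives O(n) time overall.
```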

Example: Prefix Averages

The i-th prefix average of an array X is the average of the first
(i + 1) elements of X:

  A[i] = (X[0] + X[1] + … + X[i]) / (i + 1)

This has applications to financial analysis.

[chart: an example array X plotted alongside its prefix-average array A]

Prefix Averages: Algorithm 1

Algorithm prefixAverages1(X, n)
  Input: array X of n integers
  Output: array A of prefix averages of X
  A ← new array of n integers
  for i ← 0 to n − 1 do
    s ← X[0]
    for j ← 1 to i do
      s ← s + X[j]
    A[i] ← s / (i + 1)
  return A

basic operation: addition
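A runnable Python transcription of the pseudocode above (a sketch; the
names mirror the pseudocode):

```python
def prefix_averages_1(X):
    """Prefix averages by re-summing each prefix: O(n^2) additions."""
    n = len(X)
    A = [0.0] * n
    for i in range(n):
        s = X[0]                  # restart the prefix sum from scratch
        for j in range(1, i + 1):
            s = s + X[j]          # the counted basic operation
        A[i] = s / (i + 1)
    return A
```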

Algorithm 1 Analysis

for i ← 0 to n − 1 do
  for j ← 1 to i do
    s ← s + X[j]

Number of additions is 1 + 2 + … + (n − 1)
The sum of the first n − 1 integers is (n − 1)n / 2
Algorithm prefixAverages1 is O(n^2)

Prefix Averages: Algorithm 2

Algorithm prefixAverages2(X, n)
  Input: array X of n integers
  Output: array A of prefix averages of X
  A ← new array of n integers
  s ← 0
  for i ← 0 to n − 1 do
    s ← s + X[i]
    A[i] ← s / (i + 1)
  return A

basic operation: addition
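The corresponding Python sketch of prefixAverages2:

```python
def prefix_averages_2(X):
    """Prefix averages with a running sum: O(n) additions."""
    n = len(X)
    A = [0.0] * n
    s = 0
    for i in range(n):
        s = s + X[i]      # one addition per element, reusing the sum
        A[i] = s / (i + 1)
    return A
```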

Algorithm 2 Analysis

for i ← 0 to n − 1 do
  s ← s + X[i]

Number of additions is 1 + 1 + … + 1 = n
Algorithm prefixAverages2 is O(n)
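In the spirit of the Experimental Studies slide, the two sketches can be
checked against each other and timed (this assumes prefix_averages_1 and
prefix_averages_2 from the sketches above):

```python
import random
import time

X = [random.randint(0, 100) for _ in range(5000)]
assert prefix_averages_1(X) == prefix_averages_2(X)  # same answers

for algo in (prefix_averages_1, prefix_averages_2):
    start = time.perf_counter()
    algo(X)
    print(algo.__name__, time.perf_counter() - start)
# doubling len(X) roughly quadruples the first time
# but only doubles the second
```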

Relatives of Big-Oh

big-Omega
  f(n) is Ω(g(n)) if there are positive constants c and n0
  such that f(n) ≥ c·g(n) for all n ≥ n0

big-Theta
  f(n) is Θ(g(n)) if there are positive constants c′, c″ and n0
  such that c′·g(n) ≤ f(n) ≤ c″·g(n) for all n ≥ n0

Intuition for Asymptotic Notation

f(n) is O(g(n)): f(n) is asymptotically smaller than or equal to g(n)
f(n) is Ω(g(n)): f(n) is asymptotically bigger than or equal to g(n)
f(n) is Θ(g(n)): f(n) is asymptotically the same as g(n)

[diagrams: Big-Oh, Big-Omega, and Big-Theta bounds illustrated]

Example Uses of the Relatives of Big-Oh

  5n^2 is Ω(n^2)
  5n^2 is Ω(n)
  5n^2 is Θ(n^2)
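Witness constants for each claim, plugged into the definitions above
(a worked sketch):

```latex
\begin{align*}
5n^2 \text{ is } \Omega(n^2)&: \; 5n^2 \ge 1 \cdot n^2 \text{ for } n \ge 1
  \quad (c = 1,\ n_0 = 1)\\
5n^2 \text{ is } \Omega(n)&: \; 5n^2 \ge 1 \cdot n \text{ for } n \ge 1
  \quad (c = 1,\ n_0 = 1)\\
5n^2 \text{ is } \Theta(n^2)&: \; 1 \cdot n^2 \le 5n^2 \le 5 \cdot n^2
  \text{ for } n \ge 1 \quad (c' = 1,\ c'' = 5,\ n_0 = 1)
\end{align*}
```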
