
Asymptotic Notation

Analysis of Algorithms
• An algorithm is a finite set of precise instructions for performing a computation or for solving a problem.
• What is the goal of analysis of algorithms?
  – To compare algorithms, mainly in terms of running time but also in terms of other factors (e.g., memory requirements, programmer's effort, etc.)
• What do we mean by running time analysis?
  – Determine how the running time increases as the size of the problem increases.
Input Size

• Input size (number of elements in the input)
  – size of an array
  – polynomial degree
  – # of elements in a matrix
  – # of bits in the binary representation of the input
  – vertices and edges in a graph


Types of Analysis
• Worst case
  – Provides an upper bound on running time
  – An absolute guarantee that the algorithm will not run longer, no matter what the input is
• Best case
  – Provides a lower bound on running time
  – Input is the one for which the algorithm runs the fastest
• Average case
  – Provides a prediction about the running time
  – Assumes that the input is random

How do we compare algorithms?
• We need to define a number of objective measures.
  (1) Compare execution times?
      Not good: times are specific to a particular computer!
  (2) Count the number of statements executed?
      Not good: the number of statements varies with the programming language as well as with the style of the individual programmer.


Ideal Solution

• Express the running time as a function of the input size n (i.e., f(n)).
• Compare the different functions corresponding to running times.
• Such an analysis is independent of machine time, programming style, etc.

Example
• Associate a "cost" with each statement.
• Find the "total cost" by finding the total number of times each statement is executed.
Algorithm 1                    Cost
  arr[0] = 0;                  c1
  arr[1] = 0;                  c1
  arr[2] = 0;                  c1
  ...
  arr[N-1] = 0;                c1
  -----------------------------------
  c1 + c1 + ... + c1 = c1 × N

Algorithm 2                    Cost
  for (i = 0; i < N; i++)      c2
      arr[i] = 0;              c1
  -----------------------------------
  (N+1) × c2 + N × c1 = (c2 + c1) × N + c2
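
To make the comparison concrete, here is a minimal runnable C sketch of both algorithms (the fixed size N, the function names, and the partially unrolled body are illustrative assumptions, not part of the original slides):

#include <stdio.h>

#define N 8 /* illustrative size; any positive value works */

int arr[N];

/* Algorithm 1: one explicit assignment per element.
   Each assignment costs c1, so the total cost is c1 * N. */
void algorithm1(void) {
    arr[0] = 0;
    arr[1] = 0;
    arr[2] = 0;
    /* ... the slides unroll every index in between ... */
    arr[N - 1] = 0;
}

/* Algorithm 2: the same work done by a loop.
   The loop test runs N+1 times (cost c2 each) and the body N times
   (cost c1 each): (N+1)*c2 + N*c1 = (c1 + c2)*N + c2. */
void algorithm2(void) {
    for (int i = 0; i < N; i++)
        arr[i] = 0;
}

int main(void) {
    algorithm1();
    algorithm2();
    printf("arr[0] = %d, arr[N-1] = %d\n", arr[0], arr[N - 1]);
    return 0;
}

Both versions zero the same array; their cost expressions differ only in constant factors, which is exactly why they end up in the same order of growth later in this section.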


Another Example

Algorithm 3                          Cost
  sum = 0;                           c1
  for (i = 0; i < N; i++)            c2
      for (j = 0; j < N; j++)        c2
          sum += arr[i][j];          c3
  -------------------------------------------
  c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N^2
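
To see where each term comes from, the sketch below counts the actual statement executions (the counter variables and the fixed N are illustrative assumptions, not from the slides):

#include <stdio.h>

#define N 5 /* illustrative size */

int main(void) {
    long outer_tests = 0, inner_tests = 0, body_runs = 0;
    int arr[N][N] = {{0}};
    long sum = 0; /* executed once: the c1 term */

    /* The comma operator bumps a counter every time a loop test runs. */
    for (int i = 0; outer_tests++, i < N; i++)       /* N+1 tests: the c2 * (N+1) term      */
        for (int j = 0; inner_tests++, j < N; j++) { /* N * (N+1) tests: the c2 * N * (N+1) term */
            sum += arr[i][j];                        /* N^2 executions: the c3 * N^2 term   */
            body_runs++;
        }

    printf("outer tests: %ld (expected N+1 = %d)\n", outer_tests, N + 1);
    printf("inner tests: %ld (expected N*(N+1) = %d)\n", inner_tests, N * (N + 1));
    printf("body runs:   %ld (expected N^2 = %d)\n", body_runs, N * N);
    return 0;
}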

Asymptotic Analysis
• To compare two algorithms with running times f(n) and g(n), we need a rough measure that characterizes how fast each function grows.
• Hint: use the rate of growth.
• Compare functions in the limit, that is, asymptotically! (i.e., for large values of n)


Rate of Growth
• Consider the example of buying elephants and goldfish:
    Cost: cost_of_elephants + cost_of_goldfish
    Cost ~ cost_of_elephants (approximation)
• The low-order terms in a function are relatively insignificant for large n:
    n^4 + 100n^2 + 10n + 50 ~ n^4
  i.e., we say that n^4 + 100n^2 + 10n + 50 and n^4 have the same rate of growth.
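
A quick numeric check makes the point; this small C program (an illustrative sketch, not part of the slides) prints the ratio of the full polynomial to its leading term as n grows:

#include <stdio.h>

int main(void) {
    /* The ratio of n^4 + 100n^2 + 10n + 50 to n^4 tends to 1,
       so the low-order terms become negligible for large n. */
    for (double n = 10; n <= 1e6; n *= 10) {
        double full = n*n*n*n + 100*n*n + 10*n + 50;
        double lead = n*n*n*n;
        printf("n = %8.0f   ratio = %.6f\n", n, full / lead);
    }
    return 0;
}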
Asymptotic Notation
• O notation: asymptotic "less than":
    f(n) = O(g(n)) implies f(n) "≤" g(n)
• Ω notation: asymptotic "greater than":
    f(n) = Ω(g(n)) implies f(n) "≥" g(n)
• Θ notation: asymptotic "equality":
    f(n) = Θ(g(n)) implies f(n) "=" g(n)



Asymptotic Complexity
• Running time of an algorithm as a function of input size n, for large n.
• Expressed using only the highest-order term in the expression for the exact running time.
  – Instead of the exact running time, we say Θ(n^2).
• Describes the behavior of the function in the limit.
• Written using asymptotic notation.

Asymptotic Notation
• Θ, O, Ω, o, ω
• Defined for functions over the natural numbers.
  – Ex: f(n) = Θ(n^2).
  – Describes how f(n) grows in comparison to n^2.
• Each defines a set of functions; in practice, they are used to compare the sizes of two functions.
• The notations describe different rate-of-growth relations between the defining function and the defined set of functions.


O-notation
For a function g(n), we define O(g(n)), big-O of g(n), as the set:
    O(g(n)) = { f(n) : ∃ positive constants c and n0
                such that ∀ n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n) }
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).

g(n) is an asymptotic upper bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)).
Θ(g(n)) ⊂ O(g(n)).
Examples
O(g(n)) = { f(n) : ∃ positive constants c and n0
            such that ∀ n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n) }

• Any linear function an + b is in O(n^2). How?
• Show that 3n^3 = O(n^4) for appropriate c and n0.
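
One possible choice of constants (a worked answer added here, assuming a, b ≥ 0; not part of the original slides): for an + b, note that an + b ≤ an^2 + bn^2 = (a + b)·n^2 for all n ≥ 1, so c = a + b and n0 = 1 work. For 3n^3 = O(n^4): 3n^3 ≤ 3n^4 whenever n ≥ 1, so c = 3 and n0 = 1 suffice.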


Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
    Ω(g(n)) = { f(n) : ∃ positive constants c and n0
                such that ∀ n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n) }
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).

g(n) is an asymptotic lower bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)).
Θ(g(n)) ⊂ Ω(g(n)).
Example
Ω(g(n)) = { f(n) : ∃ positive constants c and n0
            such that ∀ n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n) }

• n^3 = Ω(n^2). Choose c and n0.
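
One workable choice (a worked answer added here, not part of the original slides): since n^3 = n·n^2 ≥ 1·n^2 for all n ≥ 1, take c = 1 and n0 = 1.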


Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
    Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0
                such that ∀ n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
Intuitively: the set of all functions that have the same rate of growth as g(n).

g(n) is an asymptotically tight bound for f(n).

Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
    Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0
                such that ∀ n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
Technically, f(n) ∈ Θ(g(n)).
Older usage: f(n) = Θ(g(n)). I'll accept either…

f(n) and g(n) are nonnegative for large n.



Example
Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0
            such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }

• 10n^2 − 3n = Θ(n^2)
• What constants for n0, c1, and c2 will work? (One worked choice follows below.)
• Make c1 a little smaller than the leading coefficient, and c2 a little bigger.
• To compare orders of growth, look at the leading term.
• Exercise: Prove that n^2/2 − 3n = Θ(n^2).
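
One worked choice (added here as an illustration, not part of the original slides): take c1 = 9 and c2 = 10. Then 10n^2 − 3n ≥ 9n^2 holds exactly when n^2 ≥ 3n, i.e., for n ≥ 3, and 10n^2 − 3n ≤ 10n^2 holds for all n ≥ 0; so c1 = 9, c2 = 10, and n0 = 3 work.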
Back to Our Example
Algorithm 1                    Cost
  arr[0] = 0;                  c1
  arr[1] = 0;                  c1
  arr[2] = 0;                  c1
  ...
  arr[N-1] = 0;                c1
  -----------------------------------
  c1 + c1 + ... + c1 = c1 × N

Algorithm 2                    Cost
  for (i = 0; i < N; i++)      c2
      arr[i] = 0;              c1
  -----------------------------------
  (N+1) × c2 + N × c1 = (c2 + c1) × N + c2

• Both algorithms are of the same order: O(N)


Example (cont’d)

Algorithm 3                          Cost
  sum = 0;                           c1
  for (i = 0; i < N; i++)            c2
      for (j = 0; j < N; j++)        c2
          sum += arr[i][j];          c3
  -------------------------------------------
  c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N^2 = O(N^2)

Relations Between Θ, O, Ω

[Figure: Venn diagram over the functions ℝ → ℝ, showing Θ(f) as the intersection of O(f) and Ω(f), with the function f itself lying inside Θ(f).]

Relations Between Θ, Ω, O
Theorem: For any two functions g(n) and f(n),
    f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).

• I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
• In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.

Asymptotic Notation in Equations
• Can use asymptotic notation in equations to replace expressions containing lower-order terms.
• For example,
    4n^3 + 3n^2 + 2n + 1 = 4n^3 + 3n^2 + Θ(n)
                         = 4n^3 + Θ(n^2) = Θ(n^3).   How to interpret?
• In equations, Θ(f(n)) always stands for an anonymous function g(n) ∈ Θ(f(n)).
  – In the example above, Θ(n^2) stands for 3n^2 + 2n + 1.


o-notation
For a given function g(n), the set little-o:
    o(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that
                ∀ n ≥ n0, we have 0 ≤ f(n) < c·g(n) }

f(n) becomes insignificant relative to g(n) as n approaches infinity:
    lim (n→∞) [f(n) / g(n)] = 0

g(n) is an upper bound for f(n) that is not asymptotically tight.
Observe the difference from the previous definitions. Why? Here the inequality must hold for every positive constant c, not just for some c; that is what rules out a tight bound.
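
For instance (a worked example added here, not part of the original slides): 2n = o(n^2), since 2n / n^2 = 2/n → 0; but 2n^2 ≠ o(n^2), since the ratio is constantly 2.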
ω-notation
For a given function g(n), the set little-omega:
    ω(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that
                ∀ n ≥ n0, we have 0 ≤ c·g(n) < f(n) }

f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
    lim (n→∞) [f(n) / g(n)] = ∞

g(n) is a lower bound for f(n) that is not asymptotically tight.
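
Analogously (a worked example added here, not part of the original slides): n^2 = ω(n), since n^2 / n = n → ∞; but n^2/2 ≠ ω(n^2), since the ratio is constantly 1/2.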


Comparison of Functions
fg  ab

f (n) = O(g(n))  a  b
f (n) = (g(n))  a  b
f (n) = (g(n))  a = b
f (n) = o(g(n))  a < b
f (n) = w (g(n))  a > b
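
For example (an illustrative reading, not part of the original slides): n = O(n^2) plays the role of a ≤ b, n^3 = ω(n^2) plays the role of a > b, and 2n^2 = Θ(n^2) plays the role of a = b.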

Properties
• Symmetry
    f(n) = Θ(g(n)) iff g(n) = Θ(f(n))

• Complementarity
    f(n) = O(g(n)) iff g(n) = Ω(f(n))
    f(n) = o(g(n)) iff g(n) = ω(f(n))

