
Asymptotic Analysis

Topics
Asymptotic notation
Using basic definitions
Using limits
Analysis of summations

What is Complexity?
The level of difficulty in solving mathematically posed problems, as measured by:
the time required (time complexity)
the number of steps or arithmetic operations (computational complexity)
the memory space required (space complexity)
Major Factors in Algorithm Design
1. Correctness
An algorithm is said to be correct if, for every input, it halts with the correct output. An
incorrect algorithm might not halt at all, or it might halt with an answer other than the
desired one. A correct algorithm solves the given computational problem.
2. Algorithm Efficiency
To measure the efficiency of an algorithm, we analyze it, i.e. determine its growth rate,
and compare the efficiencies of different algorithms for the same problem.

Algorithm Growth Rates
The growth rate measures algorithm efficiency.
What is meant by efficient? An algorithm is considered efficient if its running time is
bounded by a polynomial in the input size.
Notations for asymptotic performance describe how the running time increases with the
input size. O, Omega, Theta, etc. denote asymptotic running time. These notations are
defined in terms of functions whose domains are the natural numbers, which is
convenient for describing worst-case running time.
Asymptotically efficient algorithms are usually the best choice.

Complexity Analysis
Algorithm analysis means predicting the resources an algorithm requires, such as
computational time
memory
computer hardware, etc.
Worst-case analysis
Provides an upper bound on the running time.
Average-case analysis
Provides the expected running time.
Very useful, but treat with care: what is "average"?
Random (equally likely) inputs
Real-life inputs

Worst-case Analysis
Let us suppose that
Dn = the set of inputs of size n for the problem
I = an element of Dn
t(I) = the number of basic operations performed on input I
Define a function W by
W(n) = max{ t(I) | I ∈ Dn }
called the worst-case complexity of the algorithm.
W(n) is the maximum number of basic operations performed by the algorithm on any
input of size n.
Please note that the input I for which an algorithm behaves worst depends on the
particular algorithm.
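As a concrete illustration (not part of the original slides), W(n) can be computed by brute force for a specific algorithm. The sketch below assumes linear search as the algorithm, counts comparisons as the basic operation t(I), and represents Dn by its n+1 behavioral equivalence classes (key at position j, or key absent):

```python
def linear_search_ops(arr, key):
    # t(I): number of comparisons linear search performs on input I
    ops = 0
    for x in arr:
        ops += 1
        if x == key:
            return ops
    return ops

def worst_case(n):
    # W(n) = max{ t(I) | I in Dn }; Dn is represented by the classes
    # "key at position j" (j = 0..n-1) plus "key absent"
    key = -1
    inputs = [[key if i == j else i for i in range(n)] for j in range(n)]
    inputs.append(list(range(n)))  # key absent
    return max(linear_search_ops(arr, key) for arr in inputs)
```

For linear search this brute-force maximum is n, as expected: the worst input places the key last (or omits it).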

Average Complexity
Let Pr(I) be the probability that input I occurs.
Then the average behavior of the algorithm is defined as
A(n) = Σ Pr(I)·t(I),  summed over all I ∈ Dn
We determine t(I) by analyzing the algorithm, but Pr(I) cannot in general be determined
analytically. When the algorithm can succeed or fail, the average cost splits as
Average cost = A(n) = Pr(succ)·A_succ(n) + Pr(fail)·A_fail(n)
An element I in Dn may be thought of as a set or equivalence class of inputs that affect
the behavior of the algorithm in the same way.
A brute-force way of computing the average cost:
take all possible inputs, compute their costs, and take the average.
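Continuing the linear-search illustration from above (an assumption, not from the slides): if the n "key at position j" classes are equally likely, Pr(I) = 1/n, and the brute-force average A(n) = Σ Pr(I)·t(I) comes out to (n+1)/2:

```python
def linear_search_ops(arr, key):
    # t(I): number of comparisons linear search performs on input I
    ops = 0
    for x in arr:
        ops += 1
        if x == key:
            return ops
    return ops

def average_case(n):
    # A(n) = sum over I in Dn of Pr(I) * t(I), with Pr(I) = 1/n for each
    # of the n equally likely "key at position j" classes (successful search)
    key = -1
    total = 0.0
    for j in range(n):
        arr = [key if i == j else i for i in range(n)]
        total += (1.0 / n) * linear_search_ops(arr, key)
    return total
```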

Asymptotic Notation: Properties
Categorizes algorithms based on asymptotic growth rate, e.g. linear, quadratic,
polynomial, exponential.
Ignores small constants and small inputs.
Estimates upper and lower bounds on the growth rate of the time-complexity function.
Describes the running time of an algorithm as n grows to ∞.
Describes the behavior of the function in the limit.
Limitations
Not always useful for analysis of fixed-size inputs.
All results hold only for sufficiently large inputs.

Asymptotic Notations
Asymptotic notations: Θ, O, Ω, o, ω
We use Θ to mean order exactly,
O to mean order at most,
Ω to mean order at least,
o to mean strictly lower order (a loose upper bound),
ω to mean strictly higher order (a loose lower bound).
Each notation defines a set of functions, which in practice is used to compare the sizes
of two functions.

Asymptotic Analysis
Objective
The purpose of asymptotic analysis is to examine the behavior of an algorithm for
large input sizes. More specifically, if T(n) is the running time for an input of size
n, we want to know the behavior, or growth rate, of T(n) for very large values of n.
Analysis of an algorithm for large inputs is referred to as asymptotic analysis.
The asymptotic behavior of an algorithm is often compared to some standard
mathematical function, such as n², n lg n, etc. The relationship or similarity of
behavior is expressed by a special notation called asymptotic notation.
The standard asymptotic notations commonly used in the analysis of algorithms
are O (Big-Oh), Ω (Big-Omega), and Θ (Theta).
Sometimes the additional notations o (little-oh) and ω (little-omega) are also used.

Asymptotic Notation

O-Notation
Definition
If f(n) is the running time of an algorithm, and g(n) is some standard growth function
such that for some positive constants c and n0,
0 ≤ f(n) ≤ c·g(n)  for all n ≥ n0
then
f(n) = O(g(n))  (read "f(n) is Big-Oh of g(n)")
The behavior of f(n) and g(n) is portrayed in the diagram. It follows that for n < n0, f(n)
may be above or below c·g(n), but for all n ≥ n0, f(n) falls consistently below c·g(n).
The function g(n) is said to be an asymptotic upper bound for f(n).

O-Notation
Example
Using the basic definition, we show that 3n² + 10n = O(n²).
Consider,
10 ≤ n  for n ≥ 10  (obvious!)
10n ≤ n²  for n ≥ 10  (multiplying both sides by n)
3n² + 10n ≤ 3n² + n² = 4n²  for n ≥ 10  (adding 3n² to both sides)
3n² + 10n ≤ c·n²  for n ≥ n0  (c = 4 and n0 = 10)
Therefore, it follows from the basic definition that
3n² + 10n = O(n²)
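The witnesses (c, n0) found above can be sanity-checked numerically. The sketch below (not from the slides) simply tests the defining inequality over a finite range — a check of the constants, not a proof:

```python
def holds_big_oh(f, g, c, n0, n_max=10_000):
    # Checks 0 <= f(n) <= c*g(n) for all n0 <= n < n_max (finite check only)
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max))

f = lambda n: 3 * n * n + 10 * n
g = lambda n: n * n
```

With these definitions, `holds_big_oh(f, g, 4, 10)` succeeds, while c = 3 fails for every n ≥ 10 since 10n > 0.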

O-Notation
Example (contd.)
The preceding result shows that
3n² + 10n ≤ c·n²,  for c = 4 and n ≥ 10
However, the choice of c is not unique. We can select some other c and a corresponding
n0 so that the relation still holds.
Consider,
3n² + 10n ≤ 3n² + 10n²  for n ≥ 1  (replacing n with n² on the right side)
= 13n²
3n² + 10n ≤ c·n²  for n ≥ n0  (c = 13 and n0 = 1)
The graph depicts the two solutions. Observe that both 13n² and 4n² eventually grow
faster than f(n) = 3n² + 10n.
[Figure: f(n) = 3n² + 10n against g(n) = c·n² for c = 13 (n0 = 1) and c = 4 (n0 = 10).]

O-Notation
Set-Builder Form
There can be several functions, say f1(n), f2(n), f3(n), ..., for which g(n) is an upper
bound. All such functions are said to belong to a class identified by O(g(n)):
f1(n) ≤ c1·g(n)  for n > n1
f2(n) ≤ c2·g(n)  for n > n2
f3(n) ≤ c3·g(n)  for n > n3
O(g(n)) = { f1(n), f2(n), f3(n), ... }
Using set-builder notation the relationship is expressed as
O(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
Symbolically, f(n) ∈ O(g(n)).

Ω-Notation
Definition
If f(n) is the running time of an algorithm, and g(n) is some standard growth function
such that for some positive constants c and n0,
0 ≤ c·g(n) ≤ f(n)  for all n ≥ n0
then
f(n) = Ω(g(n))  (read "f(n) is Big-Omega of g(n)")
The behavior of f(n) and g(n) is portrayed in the graph. It follows that for n < n0, f(n)
may be above or below c·g(n), but for all n ≥ n0, f(n) falls consistently above c·g(n).
The function g(n) is said to be an asymptotic lower bound for f(n).

Ω-Notation
Example 1
We show that n² − 10n = Ω(n²).
n ≥ n/2  for n ≥ 0  (obvious!)
n − 10 ≥ n/20  for n ≥ 11  (since n − 10 ≥ n/20 ⟺ 19n/20 ≥ 10 ⟺ n ≥ 200/19 ≈ 10.6)
n² − 10n ≥ n²/20  for n ≥ 11  (multiplying both sides by n)
n² − 10n ≥ c·n²  for n ≥ n0, where c = 1/20 and n0 = 11
Therefore,
n² − 10n = Ω(n²).
The behavior of the functions n² − 10n and n²/20 is shown in the graph. Observe that for
n ≥ 11, the function n²/20 falls below the function n² − 10n.
[Figure: f(n) = n² − 10n against the lower asymptotic bound g(n) = c·n², c = 1/20.]

Ω-Notation
Example 2
Next we show that 3n² − 25n = Ω(n²).
n ≥ n/2  for n ≥ 0  (obvious!)
n − 25/3 ≥ 3n/50  for n ≥ 9  (dividing the right side by 25/3 ≈ 8.3; the inequality holds for n ≥ 9)
3n² − 25n ≥ 9n²/50  for n ≥ 9  (multiplying both sides by 3n)
3n² − 25n ≥ c·n²  for n ≥ n0, where c = 9/50 and n0 = 9
Therefore,
3n² − 25n = Ω(n²).
The behavior of the functions 3n² − 25n and 9n²/50 is shown in the graph. Observe that
for n ≥ 9, the function 9n²/50 falls below the function 3n² − 25n.
[Figure: f(n) = 3n² − 25n against the lower asymptotic bound g(n) = c·n², c = 9/50, n0 = 9.]

Ω-Notation
Set-Builder Form
There can be several functions, say f1(n), f2(n), f3(n), ..., for which g(n) is a lower
bound. All such functions are said to belong to a class identified by Ω(g(n)):
f1(n) ≥ c1·g(n)  for n > n1
f2(n) ≥ c2·g(n)  for n > n2
f3(n) ≥ c3·g(n)  for n > n3
Ω(g(n)) = { f1(n), f2(n), f3(n), ... }
Using set-builder notation the relationship is expressed as
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
Symbolically, f(n) ∈ Ω(g(n)).

Θ-Notation
Definition
If f(n) is the running time of an algorithm, and g(n) is some standard growth function
such that for some positive constants c1, c2 and n0,
0 ≤ c2·g(n) ≤ f(n) ≤ c1·g(n)  for all n ≥ n0
then
f(n) = Θ(g(n))  (read "f(n) is Theta of g(n)")
The behavior of f(n) and g(n) is portrayed in the graph. It follows that for n < n0, f(n)
may be above or below g(n), but for all n ≥ n0, f(n) falls consistently between c2·g(n)
and c1·g(n). The function g(n) is said to be the asymptotic tight bound for f(n).

There can be several functions for which g(n) is an asymptotic tight bound. All such
functions are said to belong to the class identified by Θ(g(n)). Symbolically, we denote
the relationship as f(n) ∈ Θ(g(n)).

Θ-Notation
Example
We show that 5n² − 19n = Θ(n²).
Consider the upper bound,
5n² − 19n ≤ 5n²  for n ≥ 0
5n² − 19n ≤ c1·n²  for n ≥ n1, where c1 = 5 and n1 = 0
Next, consider the lower bound,
n ≥ n/2  for n ≥ 0  (obvious!)
n − 19/5 ≥ 5n/38  for n ≥ 5  (dividing the right side by 19/5 = 3.8; the inequality holds for n ≥ 5)
5n² − 19n ≥ 25n²/38  for n ≥ 5  (multiplying both sides by 5n)
5n² − 19n ≥ c2·n²  for n ≥ n2, where c2 = 25/38 and n2 = 5
It follows that
0 ≤ c2·n² ≤ 5n² − 19n ≤ c1·n²  for n ≥ n0, where n0 = 5, c1 = 5 and c2 = 25/38.
Therefore, 5n² − 19n = Θ(n²).

Θ-Notation
Example (contd.)
The relation 5n² − 19n = Θ(n²) is illustrated in the figure below. The graph shows the
asymptotic upper and lower bounds of the function f(n) = 5n² − 19n.
[Figure: upper bound g(n) = c1·n² with c1 = 5; lower bound g(n) = c2·n² with
c2 = 25/38; both enclosing f(n) = 5n² − 19n.]
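As with the O-notation example, the pair of constants can be sanity-checked numerically over a finite range (a check of the witnesses c1 = 5, c2 = 25/38, under the assumption that the lower bound first holds at n = 5 — not a proof):

```python
def holds_theta(f, g, c2, c1, n0, n_max=10_000):
    # Checks 0 <= c2*g(n) <= f(n) <= c1*g(n) for all n0 <= n < n_max
    return all(0 <= c2 * g(n) <= f(n) <= c1 * g(n) for n in range(n0, n_max))

f = lambda n: 5 * n * n - 19 * n
g = lambda n: n * n
```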

Asymptotic Notation
Constant Running Time
If the running time T(n) = c is a constant, i.e. independent of the input size, then by
convention the asymptotic behavior is denoted by
O(c) = O(1),  Ω(c) = Ω(1),  Θ(c) = Θ(1)
The convention implies that the running time of an algorithm which does not depend
on the size of the input can be expressed in any of the above ways.
If c is a constant, then using the basic definition it can be shown that
O(c·f(n)) = O(f(n))
Ω(c·f(n)) = Ω(f(n))
Θ(c·f(n)) = Θ(f(n))
The above relations imply that in asymptotic notation multiplicative constants can be
ignored.
For example, O(1000n) = O(n), Θ(7 lg n) = Θ(lg n), Ω(100·n!) = Ω(n!).

Asymptotic Notation
Θ, O, Ω Relationship
If f(n) = Θ(g(n)) then
f(n) = Ω(g(n))  and  f(n) = O(g(n)).
Conversely, if f(n) = Ω(g(n)) and f(n) = O(g(n)),
then f(n) = Θ(g(n)).
The above properties follow directly from the basic definitions.
Example (1): Since n(n−1)/2 = Θ(n²), it follows that
n(n−1)/2 = Ω(n²)
n(n−1)/2 = O(n²)
Example (2): It can be shown that
5n² + 1 = Ω(n²)  and  5n² + 1 = O(n²)
Therefore, 5n² + 1 = Θ(n²).

Asymptotic Notation
Relationship
[Figure slide; footer: Asymptotic Analysis / QAU 2008 / Dr. A. Sattar / slide 18]

Asymptotic Set Notation
Example
The relationship among the O-, Ω-, and Θ-notations can be expressed using set
notation. Consider, for example, the following sets of growth functions:
S_O(n²) = { f(n): n, n+5, lg n+4n, n^1.5+n, n+5n², n²+5n, lg n+4n², n^1.5+3n² }
where S_O(n²) is a set of functions f(n) ∈ O(n²)
S_Ω(n²) = { f(n): n+5n², n²+5n, lg n+4n², n^1.5+3n², 5n²+n³, n³+n²+n, lg n+4n⁴, n lg n+3n⁴ }
where S_Ω(n²) is a set of functions f(n) ∈ Ω(n²)
S_Θ(n²) = { f(n): n+5n², n²+5n, lg n+4n², n^1.5+3n² }
where S_Θ(n²) is a set of functions f(n) ∈ Θ(n²)
It follows that
S_Θ(n²) = S_O(n²) ∩ S_Ω(n²)

Asymptotic Notation
Order Theorem
Theorem: If f1(n) = O(g1(n)) and f2(n) = O(g2(n)) then
f1(n) + f2(n) = O( max(g1(n), g2(n)) )
Proof: By definition,
f1(n) ≤ c1·g1(n)  for n ≥ n1
f2(n) ≤ c2·g2(n)  for n ≥ n2
Let n0 = max(n1, n2) and c3 = max(c1, c2). Then
f1(n) ≤ c3·g1(n)  for n ≥ n0
f2(n) ≤ c3·g2(n)  for n ≥ n0
f1(n) + f2(n) ≤ c3·g1(n) + c3·g2(n)  for n ≥ n0
Let h(n) = max(g1(n), g2(n)). Then
f1(n) + f2(n) ≤ 2c3·h(n) = c·h(n), where c = 2c3,  for n ≥ n0
f1(n) + f2(n) ≤ c·max(g1(n), g2(n))  for n ≥ n0
Therefore, f1(n) + f2(n) = O( max(g1(n), g2(n)) ).

The theorem also applies to the Ω and Θ notations.

Asymptotic Notation
Using the Order Theorem
The relation
f1(n) + f2(n) = O( max(g1(n), g2(n)) )
implies that in a summation, the lower-order growth functions can be discarded in
favor of the highest-ranking function.
In general,
f1(n) + f2(n) + f3(n) + ... + fk(n) = O( max(g1(n), g2(n), g3(n), ..., gk(n)) )
Example: Consider the summation f(n) consisting of basic functions:
f(n) = √n + n + n^1.5 + lg n + n lg n + (lg n)² + n²
We have seen that
lg n < (lg n)² < √n < n < n lg n < n^1.5 < n²
The function n² grows faster than all other functions in the expression.
Thus, f(n) = O( max(√n, n, n^1.5, lg n, n lg n, (lg n)², n²) ) = O(n²)

Using Limits

Asymptotic Analysis
Using Limits
Use of the basic definition for determining asymptotic behavior is often awkward: it
involves an ad hoc approach or some kind of algebraic manipulation. Calculus provides
an alternative method for the analysis, which depends on evaluating the limit
lim(n→∞) f(n)/g(n)
where f(n) is the given growth function of an algorithm and g(n) is a standard function.
Depending upon the value of the limit, the relation between f(n) and g(n) can be
expressed in terms of asymptotic notations.
The condition n → ∞ is equivalent to the condition "for all n ≥ n0" in the basic
definition of asymptotic notation; either condition implies large inputs.
Use of limits simplifies asymptotic analysis.

O-Notation
Using Limits
If f(n) is the running time of an algorithm and g(n) is some standard growth function
such that
lim(n→∞) f(n)/g(n) = c,  where 0 ≤ c < ∞  (infinity is excluded),
then f(n) = O(g(n)).
Example (1): 3n² + 5n + 20 = O(n²)
lim(n→∞) (3n² + 5n + 20)/n² = lim(n→∞) (3 + 5/n + 20/n²) = 3 + 0 + 0 = 3
Therefore, 3n² + 5n + 20 = O(n²)
Example (2): 10n² + 25n + 7 = O(n³)
lim(n→∞) (10n² + 25n + 7)/n³ = lim(n→∞) (10/n + 25/n² + 7/n³) = 0 + 0 + 0 = 0
Therefore, 10n² + 25n + 7 = O(n³)

O-Notation
Examples (contd.)
Example (3): lg n = O(n)
In order to differentiate lg n we first convert the binary logarithm to the natural
logarithm, using the formula lg n = ln n / ln 2:
lim(n→∞) (lg n)/n = lim(n→∞) (ln n)/((ln 2)·n)  (∞/∞ form)
= lim(n→∞) 1/((ln 2)·n) = 0  (differentiating the numerator and denominator)
Therefore, lg n = O(n)
Example (4): n² = O(2ⁿ)
lim(n→∞) n²/2ⁿ  (∞/∞ form)
= lim(n→∞) 2n/((ln 2)·2ⁿ)  (differentiating the numerator and denominator; still ∞/∞)
= lim(n→∞) 2/((ln 2)²·2ⁿ)  (differentiating again)
= 0
Therefore, n² = O(2ⁿ)
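The limit-based rules can also be explored numerically by evaluating f(n)/g(n) at a large n. This is only a heuristic (not from the slides, and not a proof), sketched here for the preceding examples:

```python
import math

def limit_ratio(f, g, n=10**6):
    # Heuristic: approximate lim f(n)/g(n) by evaluating the ratio at a large n
    return f(n) / g(n)

# 3n^2 + 5n + 20 vs n^2 -> finite nonzero limit 3, so O(n^2)
r1 = limit_ratio(lambda n: 3 * n * n + 5 * n + 20, lambda n: n * n)
# lg n vs n -> limit 0, so lg n = O(n) (indeed o(n))
r2 = limit_ratio(lambda n: math.log2(n), lambda n: n)
# n^2 vs 2^n -> limit 0; evaluated at a moderate n to keep 2^n manageable
r3 = limit_ratio(lambda n: n * n, lambda n: 2**n, n=200)
```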

Ω-Notation
Using Limits
If f(n) is the running time of an algorithm and g(n) is some standard growth function
such that
lim(n→∞) f(n)/g(n) = c,  where 0 < c ≤ ∞  (zero is excluded),
then f(n) = Ω(g(n)).
Example (1): 7n² + 14n + 8 = Ω(n²)
lim(n→∞) (7n² + 14n + 8)/n² = 7 + 14/n + 8/n² → 7 + 0 + 0 = 7
Therefore, 7n² + 14n + 8 = Ω(n²)
Example (2): 10n³ + 5n + 2 = Ω(n²)
lim(n→∞) (10n³ + 5n + 2)/n² = 10n + 5/n + 2/n² → ∞ + 0 + 0 = ∞
Therefore, 10n³ + 5n + 2 = Ω(n²)

Θ-Notation
Using Limits
If f(n) is the running time of an algorithm and g(n) is some standard growth function
such that
lim(n→∞) f(n)/g(n) = c,  where 0 < c < ∞  (zero and infinity are excluded),
then f(n) = Θ(g(n)).
Example (1): 45n³ − 3n² − 5n + 20 = Θ(n³)
lim(n→∞) (45n³ − 3n² − 5n + 20)/n³ = 45 − 3/n − 5/n² + 20/n³ → 45 − 0 − 0 + 0 = 45
Therefore, 45n³ − 3n² − 5n + 20 = Θ(n³)
Example (2): n lg n + n + n² = Θ(n²)
lim(n→∞) (n lg n + n + n²)/n² = lg n / n + 1/n + 1 → 0 + 0 + 1 = 1
Thus, n lg n + n + n² = Θ(n²)

o-Notation
Definition
If f(n) is the running time and g(n) is some standard growth function such that
lim(n→∞) f(n)/g(n) = 0
then f(n) = o(g(n))  (read "f(n) is little-oh of g(n)").
Example (1): 5n + 20 = o(n²), for
lim(n→∞) (5n + 20)/n² = 5/n + 20/n² → 0 + 0 = 0
Therefore, 5n + 20 = o(n²)
Example (2): 10n² + 25n + 7 = o(n³)
lim(n→∞) (10n² + 25n + 7)/n³ = 10/n + 25/n² + 7/n³ → 0 + 0 + 0 = 0
Example (3): lg n = o(n), because
lim(n→∞) (lg n)/n = 0  (using L'Hôpital's rule to compute the limit)

ω-Notation
Definition
If f(n) is the running time and g(n) is some standard growth function such that
lim(n→∞) f(n)/g(n) = ∞
then f(n) = ω(g(n))  (read "f(n) is little-omega of g(n)").
Example (1): 5n³ + 20n² + n + 10 = ω(n²), for
lim(n→∞) (5n³ + 20n² + n + 10)/n² = 5n + 20 + 1/n + 10/n² → ∞ + 20 + 0 + 0 = ∞
Example (2): 10n² + 25n + 7 = ω(n), because
lim(n→∞) (10n² + 25n + 7)/n = 10n + 25 + 7/n → ∞ + 25 + 0 = ∞
Example (3): n! = ω(2ⁿ), because
lim(n→∞) n!/2ⁿ = ∞  (using Stirling's formula for n!)

Asymptotic Notation
Summary
Let f(n) be the time complexity and g(n) a standard function, and let
L = lim(n→∞) f(n)/g(n).
The table below summarizes the asymptotic behavior of f(n) in terms of g(n).

Notation          Using basic definition                           Using limits   Asymptotic bound
f(n) = O(g(n))    f(n) ≤ c·g(n) for some c > 0 and n ≥ n0          L < ∞          tight upper
f(n) = o(g(n))    f(n) < c·g(n) for all c > 0 and n ≥ n0           L = 0          loose upper
f(n) = Ω(g(n))    f(n) ≥ c·g(n) for some c > 0 and n ≥ n0          L > 0          tight lower
f(n) = ω(g(n))    f(n) > c·g(n) for all c > 0 and n ≥ n0           L = ∞          loose lower
f(n) = Θ(g(n))    c1·g(n) ≤ f(n) ≤ c2·g(n) for some c1, c2 > 0     0 < L < ∞      tight
                  and n ≥ n0

Analysis of Summations

Arithmetic Summation
Asymptotic Behavior
The sum of the first n terms of the arithmetic series is:
1 + 2 + 3 + ... + n = n(n+1)/2
Let f(n) = n(n+1)/2 and g(n) = n².
lim(n→∞) f(n)/g(n) = lim(n→∞) (n(n+1)/2)/n² = lim(n→∞) (n²/2 + n/2)/n²
= 1/2 + 1/(2n) → 1/2 + 0 = 1/2
Since the limit is non-zero and finite, it follows that
f(n) = Θ(g(n)) = Θ(n²)
Or,
1 + 2 + ... + n = Θ(n²)
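The closed form and the limiting ratio can be confirmed numerically (a quick sketch, with n = 10,000 chosen arbitrarily):

```python
def arithmetic_sum(n):
    # 1 + 2 + ... + n, computed directly
    return sum(range(1, n + 1))

n = 10_000
closed_form = n * (n + 1) // 2       # n(n+1)/2
ratio = arithmetic_sum(n) / (n * n)  # tends to 1/2, consistent with Theta(n^2)
```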

Geometric Summation
Asymptotic Behavior
The asymptotic behavior of the geometric series
1 + r + r² + ... + rⁿ
depends on the geometric ratio r. Three cases need to be considered.
Case r > 1: It can be shown that the sum f(n) of the terms is
f(n) = 1 + r + r² + ... + rⁿ = (r^(n+1) − 1)/(r − 1)
Case r = 1: This case is trivial:
f(n) = 1 + 1 + 1 + ... + 1 = n + 1 = Θ(n)
Case r < 1: It can be shown that
f(n) = 1 + r + r² + ... + rⁿ = (1 − r^(n+1))/(1 − r)
The asymptotic behavior in the first and third cases is explored by computing limits.

Geometric Summation
Case r > 1
Let f(n) = 1 + r + r² + ... + rⁿ = (r^(n+1) − 1)/(r − 1)
and let g(n) = rⁿ.
Consider the limit
lim(n→∞) f(n)/g(n) = lim(n→∞) (r^(n+1) − 1)/((r − 1)·rⁿ) = lim(n→∞) (r − 1/rⁿ)/(r − 1)
Since r > 1, 1/rⁿ → 0 as n → ∞:
lim(n→∞) f(n)/g(n) = r/(r − 1) > 0,  since r > 1
Therefore, f(n) = Θ(rⁿ) for r > 1.
Or, 1 + r + r² + ... + rⁿ = Θ(rⁿ)  for r > 1

Geometric Summation
Case r < 1
Consider f(n) = 1 + r + r² + ... + rⁿ = (1 − r^(n+1))/(1 − r)
Let g(n) = c, where c is some positive constant.
Taking the limit,
lim(n→∞) f(n)/g(n) = lim(n→∞) (1 − r^(n+1))/((1 − r)·c)
Since r < 1, r^(n+1) → 0 as n → ∞:
lim(n→∞) f(n)/g(n) = 1/((1 − r)·c) > 0,  since r < 1
Therefore, f(n) = Θ(g(n)) = Θ(c) = Θ(1), i.e.
1 + r + r² + ... + rⁿ = Θ(1)  for r < 1
To sum up,
1 + r + r² + ... + rⁿ = Θ(rⁿ)  when r > 1
1 + r + r² + ... + rⁿ = Θ(n)  when r = 1
1 + r + r² + ... + rⁿ = Θ(1)  when 0 < r < 1
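The three cases can be checked numerically; the sketch below (not from the slides) uses r = 2, r = 1 and r = 1/2 as representative ratios:

```python
def geometric_sum(r, n):
    # 1 + r + r^2 + ... + r^n
    return sum(r**k for k in range(n + 1))

# r > 1: the sum grows like r^n; the ratio to r^n approaches r/(r-1) = 2
g_big = geometric_sum(2.0, 50) / 2.0**50
# r < 1: the sum approaches the constant 1/(1-r) = 2, i.e. Theta(1)
g_small = geometric_sum(0.5, 50)
```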

Logarithmic Summation
Asymptotic Behavior
The logarithmic series has the summation
lg(1) + lg(2) + lg(3) + ... + lg(n) = lg(1·2·3···n) = lg(n!)
Let f(n) = lg(n!) and g(n) = n lg n.
lim(n→∞) f(n)/g(n) = lim(n→∞) lg(n!)/(n lg n)
= lim(n→∞) lg(√(2πn)·(n/e)ⁿ)/(n lg n)  (using Stirling's approximation)
Now, lg(√(2πn)·(n/e)ⁿ) = (1 + lg π + lg n)/2 + n lg n − n lg e, therefore
lim(n→∞) lg(√(2πn)·(n/e)ⁿ)/(n lg n) = (1 + lg π + lg n)/(2n lg n) + 1 − lg e/lg n
= 0 + 1 − 0 = 1
Since the limit is non-zero and finite, it follows that
f(n) = Θ(g(n))
Or, lg(1) + lg(2) + lg(3) + ... + lg(n) = Θ(n lg n)
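The identity lg(1) + ... + lg(n) = lg(n!) and the limiting ratio can be checked numerically. Note that the convergence toward 1 is slow, because of the −n lg e term in Stirling's approximation (a sketch, not from the slides):

```python
import math

def log_sum(n):
    # lg(1) + lg(2) + ... + lg(n) = lg(n!)
    return sum(math.log2(k) for k in range(1, n + 1))

# identity check against an exactly computed factorial
small_check = abs(log_sum(20) - math.log2(math.factorial(20)))

n = 100_000
ratio = log_sum(n) / (n * math.log2(n))  # tends to 1, slowly
```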

Harmonic Summation
Asymptotic Behavior
The sum of the first n terms of the harmonic series is
H(n) = 1 + 1/2 + 1/3 + ... + 1/n
It can be shown that
1 + 1/2 + 1/3 + ... + 1/n = ln n + γ + 1/(2n) − 1/(12n²) + ...,  where γ ≈ 0.5772
Let f(n) = H(n) and g(n) = ln n. Then
lim(n→∞) f(n)/g(n) = lim(n→∞) (ln n + γ + 1/(2n) − 1/(12n²) + ...)/ln n
= 1 + 0 + 0 − 0 + ... = 1
Since the limit is finite and non-zero, it follows that
f(n) = Θ(g(n)) = Θ(ln n) = Θ(lg n)
(logarithms of different bases differ only by a constant factor)
Or, 1 + 1/2 + 1/3 + ... + 1/n = Θ(lg n)
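The asymptotic expansion of H(n) can be verified numerically (a sketch, not from the slides; γ is truncated to ten digits):

```python
import math

def harmonic(n):
    # H(n) = 1 + 1/2 + ... + 1/n
    return sum(1.0 / k for k in range(1, n + 1))

gamma = 0.5772156649  # Euler-Mascheroni constant (truncated)
n = 1_000_000
# H(n) ~ ln n + gamma, with remaining error about 1/(2n)
expansion_error = abs(harmonic(n) - (math.log(n) + gamma))
# Ratio against the binary log tends to ln 2, still finite and nonzero
ratio = harmonic(n) / math.log2(n)
```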

Reflexive Relation
Definition:
Let X be a non-empty set and R a relation over X. Then R is said to be reflexive if
(a, a) ∈ R, ∀ a ∈ X
Example 1:
Let G be a graph. Define a relation R over the nodes of G such that if node x is
connected to node y then (x, y) ∈ R. Reflexivity is satisfied over G if every node has a
self-loop.
Example 2:
Let P be the set of all persons, and S a relation over P such that (x, y) ∈ S if x has the
same birthday as y. Of course this relation is reflexive, because
(x, x) ∈ S, ∀ x ∈ P

Reflexivity of Θ, Ω and O
Example 1:
Since 0 ≤ f(n) ≤ c·f(n)  ∀ n ≥ n0 = 1, with c = 1,
hence f(n) = O(f(n)).
Example 2:
Since 0 ≤ c·f(n) ≤ f(n)  ∀ n ≥ n0 = 1, with c = 1,
hence f(n) = Ω(f(n)).
Example 3:
Since 0 ≤ c1·f(n) ≤ f(n) ≤ c2·f(n)  ∀ n ≥ n0 = 1, with c1 = c2 = 1,
hence f(n) = Θ(f(n)).
Note: The relations Θ, Ω and O are all reflexive.

Little o and ω are not Reflexive Relations
Example:
The relation f(n) = o(f(n)) would require f(n) < c·f(n) for every c > 0 (for all
sufficiently large n), but this fails for c = 1. Therefore
1. f(n) ∉ o(f(n)), and
2. f(n) ∉ ω(f(n))
Note: Hence little-o and little-omega are not reflexive relations.

Symmetry
Definition:
Let X be a non-empty set and R a relation over X. Then R is said to be symmetric if
∀ a, b ∈ X, (a, b) ∈ R ⟹ (b, a) ∈ R
Example 1:
Let P be a set of persons, and S a relation over P such that (x, y) ∈ S if x has the
same star sign as y. This relation is symmetric because
(x, y) ∈ S ⟹ (y, x) ∈ S
Example 2:
Let P be the set of all persons, and B a relation over P such that (x, y) ∈ B if x is a
brother of y. This relation is not symmetric because, for example,
(Anwer, Sadia) ∈ B but (Sadia, Anwer) ∉ B

Symmetry over Θ
Property: Prove that
f(n) = Θ(g(n)) ⟺ g(n) = Θ(f(n))
Proof
Since f(n) = Θ(g(n)), i.e. f(n) ∈ Θ(g(n)),
∃ constants c1, c2 > 0 and n0 ∈ N such that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)  ∀ n ≥ n0   (1)
(1) ⟹ 0 ≤ f(n) ≤ c2·g(n) ⟹ 0 ≤ (1/c2)·f(n) ≤ g(n)   (2)
(1) ⟹ 0 ≤ c1·g(n) ≤ f(n) ⟹ 0 ≤ g(n) ≤ (1/c1)·f(n)   (3)
From (2) and (3): 0 ≤ (1/c2)·f(n) ≤ g(n) ≤ (1/c1)·f(n)
Suppose that 1/c2 = c3 and 1/c1 = c4. The above implies
0 ≤ c3·f(n) ≤ g(n) ≤ c4·f(n),  ∀ n ≥ n0
⟹ g(n) = Θ(f(n)),  ∀ n ≥ n0
Hence it is proved that
f(n) = Θ(g(n)) ⟺ g(n) = Θ(f(n))
Exercise:
Prove that big-O, big-Ω, little-ω, and little-o do not satisfy the symmetry property.

Transitivity
Definition:
Let X be a non-empty set and R a relation over X. Then R is said to be transitive if
∀ a, b, c ∈ X, (a, b) ∈ R ∧ (b, c) ∈ R ⟹ (a, c) ∈ R
Example 1:
Let P be the set of all persons, and B a relation over P such that (x, y) ∈ B if x is a
brother of y. This relation is transitive because
(x, y) ∈ B ∧ (y, z) ∈ B ⟹ (x, z) ∈ B
Example 2:
Let P be the set of all persons, and F a relation over P such that (x, y) ∈ F if x is the
father of y. Of course this relation is not transitive, because if (x, y) ∈ F and
(y, z) ∈ F then x is the grandfather, not the father, of z, i.e. (x, z) ∉ F.

Transitivity of Θ, Ω, O, o and ω
Prove the following:
1. f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⟹ f(n) = Θ(h(n))
2. f(n) = O(g(n)) & g(n) = O(h(n)) ⟹ f(n) = O(h(n))
3. f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⟹ f(n) = Ω(h(n))
4. f(n) = o(g(n)) & g(n) = o(h(n)) ⟹ f(n) = o(h(n))
5. f(n) = ω(g(n)) & g(n) = ω(h(n)) ⟹ f(n) = ω(h(n))
Note
It is to be noted that all of these complexity-measuring notations are in fact relations
which satisfy the transitive property.

Transitivity over Θ
Property 1
f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⟹ f(n) = Θ(h(n))
Proof
1. Since f(n) = Θ(g(n)), i.e. f(n) ∈ Θ(g(n)),
∃ constants c1, c2 > 0 and n01 ∈ N such that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)  ∀ n ≥ n01   (1)
2. Since g(n) = Θ(h(n)), i.e. g(n) ∈ Θ(h(n)),
∃ constants c3, c4 > 0 and n02 ∈ N such that
0 ≤ c3·h(n) ≤ g(n) ≤ c4·h(n)  ∀ n ≥ n02   (2)
3. Let n0 = max(n01, n02).
4. We have to show that f(n) = Θ(h(n)), i.e. that
∃ constants c5, c6 > 0 and n0 ∈ N such that
0 ≤ c5·h(n) ≤ f(n) ≤ c6·h(n).
(2) ⟹ 0 ≤ c3·h(n) ≤ g(n)   (3)
(1) ⟹ 0 ≤ c1·g(n) ≤ f(n) ⟹ 0 ≤ g(n) ≤ (1/c1)·f(n)   (4)
From (3) and (4): 0 ≤ c3·h(n) ≤ g(n) ≤ (1/c1)·f(n)
⟹ 0 ≤ c1·c3·h(n) ≤ f(n)   (5)
(1) ⟹ 0 ≤ f(n) ≤ c2·g(n) ⟹ 0 ≤ (1/c2)·f(n) ≤ g(n)   (6)
(2) ⟹ 0 ≤ g(n) ≤ c4·h(n)   (7)
From (6) and (7): 0 ≤ (1/c2)·f(n) ≤ g(n) ≤ c4·h(n)
⟹ 0 ≤ f(n) ≤ c2·c4·h(n)   (8)
From (5) and (8): 0 ≤ c1·c3·h(n) ≤ f(n) ≤ c2·c4·h(n)  ∀ n ≥ n0
i.e. 0 ≤ c5·h(n) ≤ f(n) ≤ c6·h(n), with c5 = c1·c3 and c6 = c2·c4.
Hence f(n) = Θ(h(n)).

Transitivity over Big O
Property 2
f(n) = O(g(n)) & g(n) = O(h(n)) ⟹ f(n) = O(h(n))
Proof
1. Since f(n) = O(g(n)), i.e. f(n) ∈ O(g(n)),
∃ constant c1 > 0 and n01 ∈ N such that
0 ≤ f(n) ≤ c1·g(n)  ∀ n ≥ n01   (1)
2. Since g(n) = O(h(n)), i.e. g(n) ∈ O(h(n)),
∃ constant c2 > 0 and n02 ∈ N such that
0 ≤ g(n) ≤ c2·h(n)  ∀ n ≥ n02   (2)
3. Let n0 = max(n01, n02). We now have the two inequalities
0 ≤ f(n) ≤ c1·g(n)  ∀ n ≥ n01   (1)
0 ≤ g(n) ≤ c2·h(n)  ∀ n ≥ n02   (2)
(2) ⟹ 0 ≤ c1·g(n) ≤ c1·c2·h(n)  ∀ n ≥ n02   (3)
From (1) and (3):
0 ≤ f(n) ≤ c1·g(n) ≤ c1·c2·h(n)  ∀ n ≥ n0
Now suppose that c3 = c1·c2. Then
0 ≤ f(n) ≤ c3·h(n)  ∀ n ≥ n0
Hence f(n) = O(h(n)).

Transitivity over Big Ω
Property 3
f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⟹ f(n) = Ω(h(n))
Proof
1. Since f(n) = Ω(g(n)),
∃ constant c1 > 0 and n01 ∈ N such that
0 ≤ c1·g(n) ≤ f(n)  ∀ n ≥ n01   (1)
2. Since g(n) = Ω(h(n)),
∃ constant c2 > 0 and n02 ∈ N such that
0 ≤ c2·h(n) ≤ g(n)  ∀ n ≥ n02   (2)
3. Suppose that n0 = max(n01, n02).
4. We have to show that f(n) = Ω(h(n)), i.e. that
∃ constant c3 > 0 and n0 ∈ N such that 0 ≤ c3·h(n) ≤ f(n)  ∀ n ≥ n0.
(1) ⟹ 0 ≤ c1·g(n) ≤ f(n) ⟹ 0 ≤ g(n) ≤ (1/c1)·f(n)   (3)
From (2) and (3): 0 ≤ c2·h(n) ≤ g(n) ≤ (1/c1)·f(n)
⟹ 0 ≤ c1·c2·h(n) ≤ f(n)
Hence f(n) = Ω(h(n)), ∀ n ≥ n0, with c3 = c1·c2.

Transitivity over little o
Property 4
f(n) = o(g(n)) & g(n) = o(h(n)) ⟹ f(n) = o(h(n))
Proof
1. Since f(n) = o(g(n)), for every constant c1 > 0 there exists n01 ∈ N such that
0 ≤ f(n) < c1·g(n)  ∀ n ≥ n01   (1)
2. Since g(n) = o(h(n)), for every constant c2 > 0 there exists n02 ∈ N such that
0 ≤ g(n) < c2·h(n)  ∀ n ≥ n02   (2)
3. Let c3 > 0 be given; choose c1, c2 > 0 with c1·c2 = c3, and let n0 = max(n01, n02).
(2) ⟹ 0 ≤ c1·g(n) < c1·c2·h(n)  ∀ n ≥ n02   (3)
From (1) and (3):
0 ≤ f(n) < c1·g(n) < c1·c2·h(n) = c3·h(n)  ∀ n ≥ n0
Since this holds for every c3 > 0, f(n) = o(h(n)).

Transitivity over little ω
Property 5
f(n) = ω(g(n)) & g(n) = ω(h(n)) ⟹ f(n) = ω(h(n))
Proof
1. Since f(n) = ω(g(n)), for every constant c1 > 0 there exists n01 ∈ N such that
0 ≤ c1·g(n) < f(n)  ∀ n ≥ n01   (1)
2. Since g(n) = ω(h(n)), for every constant c2 > 0 there exists n02 ∈ N such that
0 ≤ c2·h(n) < g(n)  ∀ n ≥ n02   (2)
3. Let c3 > 0 be given; choose c1, c2 > 0 with c1·c2 = c3, and let n0 = max(n01, n02).
(1) ⟹ 0 ≤ c1·g(n) < f(n) ⟹ 0 ≤ g(n) < (1/c1)·f(n)   (3)
From (2) and (3): 0 ≤ c2·h(n) < g(n) < (1/c1)·f(n)  ∀ n ≥ n0
⟹ 0 ≤ c1·c2·h(n) = c3·h(n) < f(n)  ∀ n ≥ n0
Since this holds for every c3 > 0, f(n) = ω(h(n)).

Transpose Symmetry
Property 1
Prove that f(n) = O(g(n)) ⟺ g(n) = Ω(f(n))
Proof
Since f(n) = O(g(n)),
∃ constants c > 0 and n0 ∈ N such that
0 ≤ f(n) ≤ c·g(n)  ∀ n ≥ n0
Dividing both sides by c:
0 ≤ (1/c)·f(n) ≤ g(n)  ∀ n ≥ n0
Putting c′ = 1/c:
0 ≤ c′·f(n) ≤ g(n)  ∀ n ≥ n0
Hence, g(n) = Ω(f(n)).

Transpose Symmetry
Property 2
Prove that f(n) = o(g(n)) ⟺ g(n) = ω(f(n))
Proof
Since f(n) = o(g(n)), for every constant c > 0 there exists n0 ∈ N such that
0 ≤ f(n) < c·g(n)  ∀ n ≥ n0
Dividing both sides by c:
0 ≤ (1/c)·f(n) < g(n)  ∀ n ≥ n0
Putting c′ = 1/c (which also ranges over all positive constants):
0 ≤ c′·f(n) < g(n)  ∀ n ≥ n0
Hence, g(n) = ω(f(n)).

Relation between Θ, Ω, O
Trichotomy property over real numbers
For any two real numbers a and b, exactly one of the following must hold: a < b, a = b,
or a > b.
The asymptotic comparison of two functions f and g is analogous to the comparison of
two real numbers a and b:
1. f(n) = O(g(n))  ≈  a ≤ b
2. f(n) = Ω(g(n))  ≈  a ≥ b
3. f(n) = Θ(g(n))  ≈  a = b
4. f(n) = o(g(n))  ≈  a < b
5. f(n) = ω(g(n))  ≈  a > b
(Unlike real numbers, however, not every pair of functions is asymptotically
comparable, so trichotomy does not fully carry over.)

Some Other Standard Notations
Monotonicity
A function f is:
monotonically increasing if m ≤ n ⟹ f(m) ≤ f(n)
monotonically decreasing if m ≤ n ⟹ f(m) ≥ f(n)
strictly increasing if m < n ⟹ f(m) < f(n)
strictly decreasing if m < n ⟹ f(m) > f(n)
Polynomials
Given a nonnegative integer d, a polynomial in n of degree d is a function of the form
p(n) = Σ (i = 0 to d) a_i·n^i
where the constants a_i are the coefficients of the polynomial and a_d ≠ 0.
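As a standard aside (not from the slides): a polynomial p(n) = a_0 + a_1·n + ... + a_d·n^d can be evaluated with only d multiplications using Horner's rule, i.e. in Θ(d) time:

```python
def horner(coeffs, n):
    # Evaluate p(n) = a_0 + a_1*n + ... + a_d*n^d using Horner's rule;
    # coeffs = [a_0, a_1, ..., a_d], d multiplications in total
    result = 0
    for a in reversed(coeffs):
        result = result * n + a
    return result
```

For example, with coefficients [20, 5, 3] this evaluates 3n² + 5n + 20, the polynomial used in the limit examples earlier.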

Standard Logarithm Notations
Some Definitions
Exponent: x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log₂ a
Square of log: lg² a = (lg a)²
Log of log: lg lg a = lg(lg a)

Standard Logarithm Identities
a = b^(log_b a)
log_c(ab) = log_c a + log_c b
log_b(aⁿ) = n·log_b a
log_b a = log_c a / log_c b
log_b(1/a) = −log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
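The identities above can be spot-checked numerically with Python's `math.log(x, base)` (a sanity check at one arbitrarily chosen point, not a proof):

```python
import math

a, b, c, n = 8.0, 2.0, 10.0, 3.0

# log_c(ab) = log_c a + log_c b
lhs1, rhs1 = math.log(a * b, c), math.log(a, c) + math.log(b, c)
# log_b(a^n) = n * log_b a
lhs2, rhs2 = math.log(a**n, b), n * math.log(a, b)
# change of base: log_b a = log_c a / log_c b
lhs3, rhs3 = math.log(a, b), math.log(a, c) / math.log(b, c)
# a^(log_b c) = c^(log_b a)
lhs4, rhs4 = a ** math.log(c, b), c ** math.log(a, b)
```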
