
5 Complexity

We have already used the O notation to denote the general behaviour of an algorithm as a function of the problem size. We have said that an algorithm is O(log n) if its running time, T(n), to solve a problem of size n is proportional to log n.
5.1 The O notation

Formally, O(g(n)) is the set of functions, f, such that for some c > 0,

    f(n) < c g(n)

for all positive integers, n > N, ie for all sufficiently large n. Another way of writing this is:

    lim(n→∞) f(n)/g(n) < ∞

Informally, we say that O(g) is the set of all functions which grow no faster than g. The function g is an upper bound to functions in O(g).

We are interested in the set of functions defined by the O notation because we want to argue about the relative merits of algorithms - independent of their implementations. That is, we are not concerned with the language or machine used; we want a means of comparing algorithms which is relevant to any implementation.

We can define two other functions: Ω(g) and Θ(g).

Ω(g): the set of functions f(n) for which f(n) ≥ c g(n) for all positive integers, n > N, and

    Θ(g) = O(g) ∩ Ω(g)

We can derive:

    f ∈ Ω(g) if lim(n→∞) g(n)/f(n) < ∞

Thus, Ω(g) is a lower bound - functions in Ω(g) grow at least as fast as g - and Θ(g) is the set of functions that grow at the same rate as g. In these last two statements, as in most of the discussion of complexity theory, "within a constant factor" is understood. Different languages, compilers, machines, operating systems, etc will produce different constant factors: it is the general behaviour of the running time as n increases to very large values that we're concerned with.

5.2 Properties of the O notation

The following general properties of O notation expressions may be derived:


1. Constant factors may be ignored:
   For all k > 0, kf is O(f).
   e.g. a n^2 and b n^2 are both O(n^2).
2. Higher powers of n grow faster than lower powers:
   n^r is O(n^s) if 0 ≤ r ≤ s.
3. The growth rate of a sum of terms is the growth rate of its fastest growing term:
   If f is O(g), then f + g is O(g).
   e.g. n^2 + n is O(n^2).
4. The growth rate of a polynomial is given by the growth rate of its leading term (cf. (2), (3)):
   If f is a polynomial of degree d, then f is O(n^d).
5. If f grows faster than g, which grows faster than h, then f grows faster than h.
6. The product of upper bounds of functions gives an upper bound for the product of the functions:
   If f is O(g) and h is O(r), then fh is O(gr).
   e.g. if f is O(n^2) and g is O(log n), then fg is O(n^2 log n).
7. Exponential functions grow faster than powers:
   n^k is O(b^n), for all b > 1, k ≥ 0,
   e.g. n^4 is O(2^n) and n^4 is O(exp(n)).
8. Logarithms grow more slowly than powers:
   log_b n is O(n^k) for all b > 1, k > 0,
   e.g. log n is O(n^0.5).
9. All logarithms grow at the same rate:
   log_b n is Θ(log_d n) for all b, d > 1.
10. The sum of the first n r'th powers grows as the (r+1)'th power:
    Σ(k=1..n) k^r is Θ(n^(r+1)),
    e.g. Σ(k=1..n) k = n(n+1)/2 is Θ(n^2).

5.3 Polynomial and Intractable Algorithms

5.3.1 Polynomial time complexity

An algorithm is said to have polynomial time complexity iff it is O(n^d) for some integer d.

5.3.2 Intractable Algorithms

A problem is said to be intractable if no algorithm with polynomial time complexity is known for it. We will briefly examine some intractable problems in a later section.

5.4 Analysing an algorithm

5.4.1 Simple Statement Sequence

First note that a sequence of statements which is executed once only is O(1). It doesn't matter how many statements are in the sequence - only that the number of statements (or the time that they take to execute) is constant for all problems.

5.4.2 Simple Loops

If a problem of size n can be solved with a simple loop:

for(i=0;i<n;i++)
    { s; }

where s is an O(1) sequence of statements, then the time complexity is n.O(1) or O(n).

If we have two nested loops:

for(j=0;j<n;j++)
    for(i=0;i<n;i++)
        { s; }

then we have n repetitions of an O(n) sequence, giving a complexity of n.O(n) or O(n^2).

Where the index 'jumps' by an increasing amount in each iteration, we might have a loop like:

h = 1;
while( h <= n )
    { s;
      h = 2*h; }

in which h takes the values 1, 2, 4, ... until it exceeds n. This sequence has 1 + ⌊log2 n⌋ values, so the complexity is O(log n).
If the inner loop depends on an outer loop index:

for(j=0;j<n;j++)
    for(i=0;i<j;i++)
        { s; }

then the inner loop for(i=0;i<j;i++) gets executed j times, so the total is:

    Σ(j=1..n) j = n(n+1)/2

and the complexity is O(n^2). We see that this is the same as the result for two nested loops above, so the variable number of iterations of the inner loop doesn't affect the 'big picture'.

However, if the number of iterations of one of the loops decreases by a constant factor with every iteration:

h = n;
while( h > 0 )
{
    for(i=0;i<n;i++)
        { s; }
    h = h/2;
}

then

- there are log2 n iterations of the outer loop and
- the inner loop is O(n),

so the overall complexity is O(n log n). This is substantially better than the previous case, in which the number of iterations of one of the loops decreased by a constant for each iteration!

John Morris, 1996