
# Algorithm Analysis & Asymptotic Notations

Asymptotic notations: Big-O, Ω (Omega), and Θ (Theta)

## Example Problem

Problem: given N integers stored in an array X (`int X[N]`), find the sum of the numbers.

Here is an iterative algorithm:

```c
int X[N];
int sum = 0;
for (int i = 0; i < N; i++)
    sum = sum + X[i];
```

## Analyzing Running Time of an Algorithm

RT (running time): the amount of time it takes for the algorithm to finish execution on an input of a particular size.

More precisely, the RT of an algorithm on a particular input is the number of primitive operations, or steps, executed. We define a step to be a unit of work that can be executed in a constant amount of time on a machine.

## Finding the Sum of a Set of Numbers: Iterative Algorithm and Its Analysis

Assume `int X[N]` is our data set.

| Statement                     | Cost | Times |
|-------------------------------|------|-------|
| `sum = 0;`                    | C0   | 1     |
| `for (int i = 0; i < N; i++)` | C1   | N     |
| `sum = sum + X[i];`           | C2   | N     |

T(n) = C0 + C1*N + C2*N

Since C0, C1, and C2 are constants, T(n) can be expressed as a linear function of n, i.e., T(n) = a + b*n, for some constants a, b.
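The linear cost model above can be checked by instrumenting the loop with a step counter. A minimal sketch, assuming a charging scheme of one step per statement execution (the `sum_with_count` helper is illustrative, not from the slides):

```c
#include <stddef.h>

/* Sum the array while counting executed steps, mirroring the cost table:
 * one step for sum = 0, one step per loop iteration for the loop header,
 * and one step per iteration for the addition. */
long sum_with_count(const int *x, size_t n, long *steps) {
    long sum = 0;
    *steps = 1;                      /* C0: sum = 0 executes once */
    for (size_t i = 0; i < n; i++) {
        *steps += 1;                 /* C1: loop test/increment   */
        sum += x[i];
        *steps += 1;                 /* C2: sum = sum + X[i]      */
    }
    return sum;                      /* steps total 1 + 2N, a linear a + b*N */
}
```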

## Another Example: Searching for a Number in an Array of Numbers

Assume `int X[N]` is our data set and we are searching for `key`.

| Statement                      | Cost | Times |
|--------------------------------|------|-------|
| `found = 0;`                   | C0   | 1     |
| `i = 0;`                       | C1   | 1     |
| `while (!found && i < N) {`    | C2   | L     |
| `if (key == X[i]) found = 1;`  | C3   | L     |
| `i++; }`                       | C4   | L     |

T(n) = C0 + C1 + L*(C2 + C3 + C4), where 1 <= L <= N is the number of times that the loop is iterated.

## Example 2: Searching for a Number in an Array of Numbers (continued)

What's the best case? The loop iterates just once =>
T(n) = C0 + C1 + C2 + C3 + C4

What's the average (expected) case? The loop iterates N/2 times =>
T(n) = C0 + C1 + N/2 * (C2 + C3 + C4)
Notice that this can be written as T(n) = a + b*n, where a, b are constants.

What's the worst case? The loop iterates N times =>
T(n) = C0 + C1 + N * (C2 + C3 + C4)
Notice that this can be written as T(n) = a + b*n, where a, b are constants.
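The best and worst cases can be observed directly by instrumenting the search loop. A minimal sketch, where the hypothetical helper `search_count` reports L, the number of loop iterations:

```c
#include <stddef.h>

/* Linear search that also reports L, the number of loop iterations.
 * Best case: key at X[0], so L = 1.  Worst case: key absent, so L = N. */
int search_count(const int *x, size_t n, int key, size_t *iters) {
    int found = 0;
    size_t i = 0;
    *iters = 0;
    while (!found && i < n) {
        (*iters)++;
        if (key == x[i]) found = 1;
        i++;
    }
    return found;
}
```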

## Worst Case Analysis of Algorithms

We will only look at the WORST CASE running time of an algorithm. Why?

- The worst case is an upper bound on the running time. It gives us a guarantee that the algorithm will never take any longer.
- For some algorithms, the worst case happens fairly often. As in this search example, the searched item is typically not in the array, so the loop will iterate N times.
- The average case is often roughly as bad as the worst case. In our search algorithm, both the average case and the worst case are linear functions of the input size n.

## Asymptotic Notation

We will study the asymptotic efficiency of algorithms. To do so, we look at input sizes large enough to make only the order of growth of the running time relevant. That is, we are concerned with how the running time of an algorithm increases with the size of the input in the limit, as the size of the input increases without bound. Usually an algorithm that is asymptotically more efficient will be the best choice for all but very small inputs.

There are three asymptotic notations: Big-O, Ω (Omega), and Θ (Theta).

## Big-Oh Notation: Asymptotic Upper Bound

T(n) = f(n) = O(g(n)) if f(n) <= c*g(n) for all n > n0, where c and n0 are constants > 0.

(Figure: f(n) stays at or below c*g(n) for all n beyond n0.)

Algorithm A is O(g(n)) if, for any reasonable implementation of the algorithm on any reasonable computer, the time required by A to solve a problem of size n is O(g(n)).

Example: T(n) = 2n + 5 is O(n). Why?
2n + 5 <= 3n, for all n >= 5

T(n) = 5*n^2 + 3*n + 15 is O(n^2). Why?
5*n^2 + 3*n + 15 <= 6*n^2, for all n >= 6
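The witness constants in these examples can be sanity-checked numerically over a range of n. A small sketch (the function name and the checked range are illustrative assumptions):

```c
/* Numerically check the Big-Oh witnesses used above:
 * 2n + 5 <= 3n for all n >= 5, and
 * 5n^2 + 3n + 15 <= 6n^2 for all n >= 6. */
int big_oh_witnesses_hold(long upto) {
    for (long n = 5; n <= upto; n++)
        if (2*n + 5 > 3*n) return 0;
    for (long n = 6; n <= upto; n++)
        if (5*n*n + 3*n + 15 > 6*n*n) return 0;
    return 1;   /* both inequalities held on the whole range */
}
```

A loop like this is not a proof, but it catches a wrongly chosen c or n0 immediately.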

## Ω Notation: Asymptotic Lower Bound

T(n) = f(n) = Ω(g(n)) if f(n) >= c*g(n) for all n > n0, where c and n0 are constants > 0.

(Figure: f(n) stays at or above c*g(n) for all n beyond n0.)

Example: T(n) = 2n + 5 is Ω(n). Why?
2n + 5 >= 2n, for all n > 0

T(n) = 5*n^2 - 3*n is Ω(n^2). Why?
5*n^2 - 3*n >= 4*n^2, for all n >= 4

## Θ Notation: Asymptotic Tight Bound

T(n) = f(n) = Θ(g(n)) if c1*g(n) <= f(n) <= c2*g(n) for all n > n0, where c1, c2, and n0 are constants > 0.

(Figure: f(n) is sandwiched between c1*g(n) and c2*g(n) for all n beyond n0.)

Example: T(n) = 2n + 5 is Θ(n). Why?
2n <= 2n + 5 <= 3n, for all n >= 5

T(n) = 5*n^2 - 3*n is Θ(n^2). Why?
4*n^2 <= 5*n^2 - 3*n <= 5*n^2, for all n >= 4
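The two-sided Θ witnesses can be checked the same way; a small sketch (the helper is illustrative, not part of the slides):

```c
/* Numerically check the tight-bound witnesses for T(n) = 5n^2 - 3n:
 * 4n^2 <= 5n^2 - 3n <= 5n^2 for all n >= 4. */
int theta_witnesses_hold(long upto) {
    for (long n = 4; n <= upto; n++) {
        long t = 5*n*n - 3*n;
        if (t < 4*n*n || t > 5*n*n) return 0;
    }
    return 1;   /* the sandwich held on the whole range */
}
```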

## Big-Oh, Theta, Omega

- Think of O(f(N)) as "less than or equal to" f(N). Upper bound: grows slower than or at the same rate as f(N).
- Think of Ω(f(N)) as "greater than or equal to" f(N). Lower bound: grows faster than or at the same rate as f(N).
- Think of Θ(f(N)) as "equal to" f(N). Tight bound: same growth rate.

## Example

Calculate S = Σ (i = 1 to N) of i^3 with a simple four-line loop:

- Lines 1 and 4 count for one unit each.
- Line 3: executed N times, four units each time, for 4N in total.
- Line 2: 1 for the initialization, N+1 for all the tests, and N for all the increments, for 2N + 2 in total.
- Total cost: 6N + 4 = O(N).
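The four analyzed lines correspond to a function along these lines (a reconstruction consistent with the stated per-line costs; the slide's original code did not survive extraction):

```c
/* Reconstructed four-line loop whose cost is analyzed above. */
int sum_of_cubes(int n) {
    int partialSum = 0;              /* line 1: 1 unit                    */
    for (int i = 1; i <= n; i++)     /* line 2: 2N + 2 units              */
        partialSum += i * i * i;     /* line 3: 4N units (2 mults,        */
                                     /*         1 add, 1 assignment)      */
    return partialSum;               /* line 4: 1 unit; total 6N + 4      */
}
```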

## General Rules

For loops:
- at most the running time of the statements inside the for loop (including the tests) times the number of iterations.

Nested for loops:
- the running time of the statement multiplied by the product of the sizes of all the for loops, e.g. O(N^2) for a doubly nested loop of size N.

## General Rules (cont'd)

Consecutive statements:
- add the running times; the largest term dominates, e.g. O(N) + O(N^2) = O(N^2).

If/Else:
- never more than the running time of the test plus the larger of the running times of S1 and S2.
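The nested-loop rule can be seen by counting body executions directly. A minimal sketch (the helper name is an assumption):

```c
/* Doubly nested loop: the O(1) body runs N * N times, so the rule
 * "product of the sizes of all the for loops" gives O(N^2). */
long nested_loop_steps(long n) {
    long steps = 0;
    for (long i = 0; i < n; i++)
        for (long j = 0; j < n; j++)
            steps++;        /* constant-time body */
    return steps;           /* equals n * n */
}
```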

## Another Example: Maximum Subsequence Sum Problem

Given (possibly negative) integers A1, A2, ..., An, find the maximum value of Σ (k = i to j) of Ak.

For convenience, the maximum subsequence sum is 0 if all the integers are negative.

E.g. for input 2, 11, -4, 13, -5, -2, the answer is 22 (the subsequence 2, 11, -4, 13).

Algorithm 1: Simple
Exhaustively tries all possibilities (brute force)
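The brute-force idea can be sketched as three nested loops: one over each choice of i and j, and one to sum the elements in between, giving O(N^3). A minimal sketch (the function name is an assumption):

```c
#include <stddef.h>

/* Brute force: try every (i, j) pair, sum A[i..j], keep the best.
 * Three nested loops over N => O(N^3).  By the convention above, the
 * answer is 0 if all integers are negative (the empty subsequence). */
int max_subseq_sum(const int *a, size_t n) {
    int best = 0;
    for (size_t i = 0; i < n; i++)
        for (size_t j = i; j < n; j++) {
            int sum = 0;
            for (size_t k = i; k <= j; k++)
                sum += a[k];
            if (sum > best) best = sum;
        }
    return best;
}
```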

| Name        | Big-Oh       | Comment                                                                                |
|-------------|--------------|----------------------------------------------------------------------------------------|
| Constant    | O(1)         | Can't beat it!                                                                         |
| Log log     | O(log log N) | Extrapolation search                                                                   |
| Logarithmic | O(log N)     | Typical time for good searching algorithms                                             |
| Linear      | O(N)         | About the fastest an algorithm can run, given that we need O(n) just to read the input |
| N log N     | O(N log N)   |                                                                                        |
| Quadratic   | O(N^2)       | Acceptable when the data size is small (N < 1000)                                      |
| Cubic       | O(N^3)       | Acceptable when the data size is small (N < 1000)                                      |
| Exponential | O(2^N)       | Only good for really small input sizes (n <= 20)                                       |

The rows are listed in order of increasing cost; everything up to O(N^3) is polynomial time.


## Some Math

Sum: S(N) = 1 + 2 + 3 + ... + N = Σ (i = 1 to N) of i = N(N+1)/2

Sum of squares: Σ (i = 1 to N) of i^2 = N(N+1)(2N+1)/6 ≈ N^3/3

Geometric series (A > 1): Σ (i = 0 to N) of A^i = (A^(N+1) - 1) / (A - 1)

Geometric series (A < 1): Σ (i = 0 to N) of A^i = (1 - A^(N+1)) / (1 - A) <= 1 / (1 - A)
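These closed forms are easy to sanity-check against direct loop sums; a small sketch with illustrative helper names:

```c
/* Direct loop sums, to compare against the closed forms above. */
long sum_1_to_n(long n)  { long s = 0; for (long i = 1; i <= n; i++) s += i;     return s; }
long sum_sq_to_n(long n) { long s = 0; for (long i = 1; i <= n; i++) s += i * i; return s; }
```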


## Some More Math

Linear geometric series: Σ (i = 0 to n) of i*x^i = x + 2x^2 + 3x^3 + ... + n*x^n = (x - (n+1)*x^(n+1) + n*x^(n+2)) / (x - 1)^2

Harmonic series: H_n = Σ (i = 1 to n) of 1/i = 1 + 1/2 + 1/3 + ... + 1/n = ln(n) + O(1)

Logs:
- log A^B = B * log A
- log(A * B) = log A + log B
- log(A / B) = log A - log B

## More on Summations

Splitting the bounds: Σ (i = 0 to n) of f(i) = Σ (i = 0 to a-1) of f(i) + Σ (i = a to n) of f(i)

Linearity of summations: Σ (i = 1 to n) of (4i^2 + 6i) = 4 * Σ (i = 1 to n) of i^2 + 6 * Σ (i = 1 to n) of i
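Both rules can be checked numerically. A small sketch (the specific example Σ(4i^2 + 6i) is an assumption where the slide's exponents were unclear, as is the helper name):

```c
/* Check two summation rules numerically:
 * splitting:  sum_{i=0}^{n} f(i) = sum_{i=0}^{a-1} f(i) + sum_{i=a}^{n} f(i)
 * linearity:  sum (4i^2 + 6i) = 4 * sum i^2 + 6 * sum i               */
int summation_rules_hold(long n, long a) {
    long whole = 0, head = 0, tail = 0, lin = 0, si = 0, sq = 0;
    for (long i = 0; i <= n; i++) whole += i * i;   /* f(i) = i^2 here */
    for (long i = 0; i < a;  i++) head  += i * i;
    for (long i = a; i <= n; i++) tail  += i * i;
    for (long i = 1; i <= n; i++) {
        lin += 4 * i * i + 6 * i;
        si  += i;
        sq  += i * i;
    }
    return whole == head + tail && lin == 4 * sq + 6 * si;
}
```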
