CS 102
Syllabus
Real Time Applications
Algorithm:
A finite set of explicit, unambiguous steps which, when carried out for a given set of initial conditions, produces the corresponding output and terminates in finite time.
Program:
An implementation of an algorithm in some programming language.
Data Structure:
Organization of data needed to solve the problem
Good Algo.?
Efficient in:
- Running time
- Space used
Limitations
- It is necessary to implement and test the algorithm in order to determine its running time.
- In order to compare two or more algorithms, the same hardware and software environments should be used.
- Single processor
- 32-bit
- Sequential execution
- 1 unit of time for arithmetic and logical operations
- 1 unit of time for assignment and return
Few Examples
int Add(int a, int b)
{ return a + b; }

T_Add = 1 + 1 = 2 units of time (1 for the addition, 1 for the return)
      = constant time
(per-line operation counts for the list-summing code: n+1, n, 1)
T_Sum_of_list = 1 + 3(n+1) + 2n + 1
              = 5n + 5
              = c·n + c
T_Sum_of_Matrices = a·n² + b·n + c
T_Add = O(1)
T_Sum_of_list = O(n)
T_Sum_of_Matrices = O(n²)
Asymptotic Notation
- Loops
- Nested loops
- Consecutive statements
- if-else statements
- Logarithmic statements
Asymptotic Notations
O, Ω, Θ, o, ω
Defined for functions over the natural numbers.
Ex: f(n) = Θ(n²).
Describes how f(n) grows in comparison to n².
Define a set of functions; in practice used to compare
two function sizes.
The notations describe different rate-of-growth
relations between the defining function and the
defined set of functions.
Asymptotic Notation
Big Oh notation (with a capital letter O, not a zero),
also called Landau's symbol, is a symbolism used in
complexity theory, computer science, and mathematics
to describe the asymptotic behavior of functions.
Basically, it tells you how fast a function grows or
declines.
Landau's symbol comes from the name of the German
number theoretician Edmund Landau who invented
the notation. The letter O is used because the rate of
growth of a function is also called its order.
Big-Oh Example
Example: the function n² is not O(n)
  n² ≤ c·n
  n ≤ c
The above inequality cannot be satisfied since c must be a constant.
n² is O(n²).
3n³ + 20n² + 5 is O(n³)
3 log n + 5 is O(log n):
need c > 0 and n0 ≥ 1 such that 3 log n + 5 ≤ c·log n for n ≥ n0;
this is true for c = 8 and n0 = 2 (log base 2).
Prove that f(n) = 5n² + 2n + 1 is O(n²).
AKS algorithm
Agrawal, Kayal and Saxena from IIT Kanpur came up with an algorithm that tests primality in time polynomial in log n.
"PRIMES is in P", Annals of Mathematics, 160(2): 781-793, 2004.
Why the base of the logarithm does not matter:
Let f(n) = log_m n and g(n) = log_k n.
By the change-of-base identity, log_m n = log_k n · log_m k, so
f(n) = g(n) · log_m k ≤ c·g(n), where c = log_m k.
Thus f(n) ≤ c·g(n), and f(n) and g(n) are both O(log n).
Ω-notation
For function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
Ω(g(n)) = {f(n) : ∃ positive constants c and n0, such that ∀ n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n)}
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).
Prove that f(n) = 5n² + 2n + 1 is Ω(n²).
Example
Ω(g(n)) = {f(n) : ∃ positive constants c and n0, such that ∀ n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n)}
n is Ω(log n): c·log n ≤ n for all n ≥ 16 (e.g. with c = 4, log base 2).
Omega
Omega gives us a LOWER BOUND on a
function.
Big-Oh says, "Your algorithm is at least this
good."
Omega says, "Your algorithm is at least this
bad."
Θ-notation
For function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0, such that ∀ n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}
Intuitively: the set of all functions that have the same rate of growth as g(n).
Θ-notation
Technically, f(n) ∈ Θ(g(n)).
Older usage: f(n) = Θ(g(n)).
I'll accept either.
Example
Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0, such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}
10n² - 3n = Θ(n²)
What constants for n0, c1, and c2 will work?
Make c1 a little smaller than the leading coefficient, and c2 a little bigger.
To compare orders of growth, look at the
leading term.
Exercise: Prove that n²/2 - 3n = Θ(n²).
Example
Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0, such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}
Is 3n³ ∈ Θ(n⁴)?
How about 2^(2n) ∈ Θ(2^n)?
Examples
3n² + 17
- Ω(1), Ω(n), Ω(n²): lower bounds
- O(n²), O(n³), ...: upper bounds
- Θ(n²): exact bound
Relations Between O, Ω, and Θ
o-notation
For a given function g(n), the set little-o:
o(g(n)) = {f(n) : ∀ c > 0, ∃ n0 > 0 such that ∀ n ≥ n0, we have 0 ≤ f(n) < c·g(n)}.
f(n) becomes insignificant relative to g(n) as n approaches infinity.