
Data Structures (CS 102)

Dr. Balasubramanian Raman


Associate Professor
Department of Computer Science and Engineering
Indian Institute of Technology Roorkee, INDIA
balaiitr@ieee.org
http://people.iitr.ernet.in/facultywebsite/balarfma/Website/
Office: ECE, S 227 or MCA Block 103

CS 102
Syllabus
Real Time Applications

Why This Course?

You will be able to evaluate the quality of a program
(analysis of algorithms: running time and memory space)
You will be able to write fast programs
You will be able to solve new problems
You will be able to give non-trivial methods to solve
problems (your algorithm (program) will be faster than
others)

Algorithm:
A finite set of explicit, unambiguous steps which,
when carried out for a given set of initial conditions,
produces the corresponding output and terminates
in finite time.

Program:
An implementation of an algorithm in some
programming language

Data Structure:
Organization of data needed to solve the problem

Good Algorithm?
Efficient
Running Time
Space Used

Running time depends on

Single vs multi processor
Read or write speed to memory
32 bit vs 64 bit
Input: efficiency is measured as a function of input size
(number of bits in an input number, number of data
elements), i.e. the rate of growth of time with the input

Time complexity: the amount of computation
time (CPU time) a program needs to run
Space complexity: the amount of memory a
program needs to run to completion

Measuring the Running Time


The C standard library provides a function
called clock (in header file time.h) that can
sometimes be used in a simple way to time
computations:
clock_t start, finish;
start = clock();
sort(x.begin(), x.end());
// Call to STL generic sort algorithm
finish = clock();
cout << "Time for sort (seconds): "
     << ((double)(finish - start)) / CLOCKS_PER_SEC;
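A complete, compilable version of the same idea (the vector size and
the random test data are my additions, not from the slide):

#include <algorithm>  // std::sort
#include <cstdlib>    // std::rand
#include <ctime>      // clock, clock_t, CLOCKS_PER_SEC
#include <iostream>
#include <vector>

int main() {
    // Arbitrary test input so that sort has work to do.
    std::vector<int> x(1000000);
    for (std::size_t i = 0; i < x.size(); ++i)
        x[i] = std::rand();

    clock_t start = clock();
    std::sort(x.begin(), x.end());   // STL generic sort
    clock_t finish = clock();

    std::cout << "Time for sort (seconds): "
              << (double)(finish - start) / CLOCKS_PER_SEC << std::endl;
    return 0;
}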

Limitations
It is necessary to implement and test the
algorithm in order to determine its running time.
In order to compare two or more algorithms, the
same hardware and software environments
should be used.

How to Analyze Time Complexity?

Machine model:
- Single processor
- 32 bit
- Sequential execution
- 1 unit of time for arithmetic and logical operations
- 1 unit of time for assignment and return

Few Examples
int Add(int a, int b)
{ return a+b; }

T_Add = 1 + 1 = 2 units of time (one addition, one return)
      = constant time

Sum of all elements in the list

int Sum_of_list(int A[], int n)   // Cost   No. of times
{
    int sum = 0;                  //  1         1
    for (int i = 0; i < n; i++)   //  3         n+1
        sum = sum + A[i];         //  2         n
    return sum;                   //  1         1
}

T_Sum_of_list = 1 + 3(n+1) + 2n + 1
              = 5n + 5
              = cn + c

T_Sum_of_Matrices = a*n^2 + b*n + c

T_Add             = O(1)
T_Sum_of_list     = O(n)
T_Sum_of_Matrices = O(n^2)
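The slides quote T_Sum_of_Matrices without showing the routine; a
minimal sketch of the kind of code presumably meant (the name, the
Matrix type, and the signature are my assumptions):

#include <vector>
typedef std::vector<std::vector<int> > Matrix;

// Element-wise sum of two n-by-n matrices: the constant-time body of
// the doubly nested loop runs n*n times, giving T(n) = a*n^2 + b*n + c.
Matrix Sum_of_Matrices(const Matrix& A, const Matrix& B, int n) {
    Matrix C(n, std::vector<int>(n, 0));
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            C[i][j] = A[i][j] + B[i][j];
    return C;
}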
Asymptotic Notation

Five Important Guidelines for Finding
Time Complexity in Code
1. Loops
2. Nested loops
3. Consecutive statements
4. if-else statements
5. Logarithmic statements
(each illustrated in the sketch below)
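A sketch illustrating the five guidelines in one function (the
function and variable names are mine, not from the slides):

// Array A of length n; 'found' selects the if-else branch.
int guidelines_demo(const int A[], int n, bool found) {
    int sum = 0, count = 0, steps = 0, x = 0;

    // 1. Loop: the body runs n times                   -> O(n)
    for (int i = 0; i < n; i++) sum += A[i];

    // 2. Nested loops: multiply the iteration counts   -> O(n*n) = O(n^2)
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) count++;

    // 3. Consecutive statements: add the costs of the
    //    two blocks above: O(n) + O(n^2) = O(n^2)

    // 4. if-else: charge the more expensive branch     -> O(n)
    if (found) x = 0;                                  // O(1) branch
    else for (int i = 0; i < n; i++) x += A[i];        // O(n) branch

    // 5. Logarithmic: i doubles every pass, so the
    //    loop runs about log2(n) times                 -> O(log n)
    for (int i = 1; i <= n; i *= 2) steps++;

    return sum + count + x + steps;
}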

Asymptotic Notations
O, Ω, Θ, o, ω
Defined for functions over the natural numbers.
Ex: f(n) = Θ(n^2).
Describes how f(n) grows in comparison to n^2.
Each notation defines a set of functions; in practice it is
used to compare the sizes of two functions.
The notations describe different rate-of-growth relations
between the defining function and the defined set of
functions.

Asymptotic Notation
Big Oh notation (with a capital letter O, not a zero),
also called Landau's symbol, is a symbolism used in
complexity theory, computer science, and mathematics
to describe the asymptotic behavior of functions.
Basically, it tells you how fast a function grows or
declines.
Landau's symbol comes from the name of the German
number theoretician Edmund Landau who invented
the notation. The letter O is used because the rate of
growth of a function is also called its order.

Big-Oh Notation (Formal Definition)

Given functions f(n) and g(n), we say that f(n) is
O(g(n)) if there are positive constants
c and n0 such that
f(n) ≤ c*g(n) for n ≥ n0
Example: 2n + 10 is O(n)
2n + 10 ≤ cn
(c - 2)n ≥ 10
n ≥ 10/(c - 2)
Pick c = 3 and n0 = 10

Big-Oh Example
Example: the function n^2 is not O(n)
n^2 ≤ cn
n ≤ c
The above inequality cannot be satisfied,
since c must be a constant
n^2 is O(n^2).

More Big-Oh Examples

7n - 2
7n - 2 is O(n)
need c > 0 and n0 ≥ 1 such that 7n - 2 ≤ cn for n ≥ n0
this is true for c = 7 and n0 = 1

3n^3 + 20n^2 + 5
3n^3 + 20n^2 + 5 is O(n^3)
need c > 0 and n0 ≥ 1 such that 3n^3 + 20n^2 + 5 ≤ c*n^3 for n ≥ n0
this is true for c = 4 and n0 = 21

3 log n + 5
3 log n + 5 is O(log n)
need c > 0 and n0 ≥ 1 such that 3 log n + 5 ≤ c*log n for n ≥ n0
this is true for c = 8 and n0 = 2

Big-Oh and Growth Rate

The big-Oh notation gives an upper bound on the growth
rate of a function.
The statement "f(n) is O(g(n))" means that the growth rate
of f(n) is no more than the growth rate of g(n).
Useful for bounding the worst case of an algorithm.

Prove that

f(n) = 5n^2 + 2n + 1

is O(n^2)
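One possible choice of constants (a sketch): for n ≥ 1,
5n^2 + 2n + 1 ≤ 5n^2 + 2n^2 + n^2 = 8n^2,
so f(n) ≤ c*n^2 holds with c = 8 and n0 = 1, hence f(n) is O(n^2).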

Write an efficient program to find whether a
given number is prime or not.
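A minimal sketch of such a program, using trial division up to
sqrt(n); an O(sqrt(n)) method, not the fastest known:

#include <iostream>

// Trial-division primality test: any divisor pair (d, n/d) has one
// member ≤ sqrt(n), so it suffices to test divisors up to sqrt(n).
// The loop makes roughly sqrt(n) iterations: O(sqrt(n)) operations.
bool is_prime(long long n) {
    if (n < 2) return false;
    for (long long d = 2; d * d <= n; d++)
        if (n % d == 0) return false;
    return true;
}

int main() {
    long long n;
    std::cin >> n;
    std::cout << (is_prime(n) ? "prime" : "not prime") << std::endl;
    return 0;
}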

AKS algorithm
Agrawal, Kayal and Saxena from IIT Kanpur
came up with a deterministic primality-testing
algorithm that runs in time polynomial in log n
(the number of digits of n)
"PRIMES is in P", Annals of Mathematics, 160(2):
781-793, 2004

Several awards and prizes:
Infosys Mathematics Prize, 2008
Fulkerson Prize for the paper "PRIMES is in P", 2006
Gödel Prize for the paper "PRIMES is in P", 2006

Programming Contest Sites


http://icpc.baylor.edu/public/worldMap/WorldFinals-2014 (ACM ICPC)
http://www.codechef.com (Online Contest)
http://www.topcoder.com
http://www.spoj.com
http://www.interviewstreet.com

Find the output of the following code

int n = 32;
int steps = 0;
for (int i = 1; i <= n; i *= 2)
    steps++;
cout << steps;
// i takes the values 1, 2, 4, 8, 16, 32, so the loop body runs
// log2(32) + 1 = 6 times and the code prints 6.
Example of an O(log n) algorithm.

A = log10(n) and B = log2(n) are both O(log n);
the base of the logarithm doesn't matter.

Suppose x_m = log_m(n) = f(n) and x_k = log_k(n) = g(n).
Then m^(x_m) = n and k^(x_k) = n,
so x_m = x_k * log_m(k).
Hence f(n) = g(n) * log_m(k) = c*g(n), where c = log_m(k),
i.e. f(n) ≤ c*g(n),
so f(n) and g(n) are both O(log n).

Ω-notation
For a function g(n), we define Ω(g(n)),
big-Omega of g(n), as the set:
Ω(g(n)) = {f(n) :
there exist positive constants c and n0
such that for all n ≥ n0,
we have 0 ≤ c*g(n) ≤ f(n)}
Intuitively: the set of all functions
whose rate of growth is the
same as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).

Prove that

f(n) = 5n^2 + 2n + 1

is Ω(n^2)
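One possible choice of constants (a sketch): for n ≥ 1,
5n^2 + 2n + 1 ≥ 5n^2 ≥ 1*n^2,
so 0 ≤ c*n^2 ≤ f(n) holds with c = 1 and n0 = 1, hence f(n) is Ω(n^2).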

Example
Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that
for all n ≥ n0, we have 0 ≤ c*g(n) ≤ f(n)}

n = Ω(log n). Choose c and n0:
for c = 1 and n0 = 16,
c * log2(n) ≤ n for n ≥ 16

Omega
Omega gives us a LOWER BOUND on a function.
Big-Oh says, "Your algorithm is at least this good."
Omega says, "Your algorithm is at least this bad."

Θ-notation
For a function g(n), we define Θ(g(n)),
big-Theta of g(n), as the set:
Θ(g(n)) = {f(n) :
there exist positive constants c1, c2, and n0
such that for all n ≥ n0,
we have 0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n)}
Intuitively: the set of all functions that
have the same rate of growth as g(n).

g(n) is an asymptotically tight bound for f(n).

Θ-notation
For a function g(n), we define Θ(g(n)),
big-Theta of g(n), as the set:
Θ(g(n)) = {f(n) :
there exist positive constants c1, c2, and n0
such that for all n ≥ n0,
we have 0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n)}
Technically, f(n) ∈ Θ(g(n)).
Older usage: f(n) = Θ(g(n)).
I'll accept either.

f(n) and g(n) are nonnegative for large n.

Example
Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0
such that for all n ≥ n0, 0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n)}

10n^2 - 3n = Θ(n^2)
What constants for n0, c1, and c2 will work?
Make c1 a little smaller than the leading
coefficient, and c2 a little bigger.
To compare orders of growth, look at the leading term.
Exercise: Prove that n^2/2 - 3n = Θ(n^2)

Example
Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0
such that for all n ≥ n0, 0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n)}

Is 3n^3 ∈ Θ(n^4)?
How about 2^(2n) ∈ Θ(2^n)?
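Sketch of the answers: 3n^3 is O(n^4) but not Ω(n^4), since
3n^3 / n^4 = 3/n tends to 0, so no positive constant c1 can satisfy
c1*n^4 ≤ 3n^3 for all large n; hence 3n^3 is not Θ(n^4).
Similarly 2^(2n) = 4^n is not Θ(2^n): 4^n / 2^n = 2^n grows without
bound, so no constant c2 gives 4^n ≤ c2*2^n.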

Examples
3n^2 + 17:
Ω(1), Ω(n), Ω(n^2): lower bounds
O(n^2), O(n^3), ...: upper bounds
Θ(n^2): exact bound

Relations Between O, Ω, Θ

o-notation
For a given function g(n), the set little-o:
o(g(n)) = {f(n) : for every c > 0, there exists n0 > 0 such that
for all n ≥ n0, we have 0 ≤ f(n) < c*g(n)}.
f(n) becomes insignificant relative to g(n) as n
approaches infinity:

lim (n -> ∞) f(n)/g(n) = 0

g(n) is an upper bound for f(n) that is not
asymptotically tight.
Observe the difference in this definition from the
previous ones. Why?
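For example, 2n is o(n^2), since 2n / n^2 = 2/n tends to 0; but 2n^2
is not o(n^2), since 2n^2 / n^2 = 2, not 0. The difference from
big-Oh: little-o requires f(n) < c*g(n) for every c > 0 (beyond some
n0), while big-Oh only requires it for some c.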
