
CS648 : Randomized Algorithms

Semester II, 2014-15, CSE, IIT Kanpur


Assignment - 3 (due on 7th February 6:00 PM)
Note: Be very rigorous in providing mathematical details in support of your arguments. Also mention
any lemma/theorem you use.
1. Two-dimensional Pattern Matching
(a) First we shall discuss a different fingerprinting technique to solve the one-dimensional pattern
matching problem. The idea is to map any bit string $s$ into a $2 \times 2$ matrix $M(s)$, as follows.
i. For the empty string $\epsilon$, $M(\epsilon) = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.
ii. $M(0) = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$.
iii. $M(1) = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}$.
iv. For non-empty strings $x$ and $y$, $M(xy) = M(x) \cdot M(y)$ (matrix product).
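A minimal Python sketch of the fingerprint map defined above, computing $M(s)$ exactly over the integers; the function name and the explicit $2 \times 2$ multiplication are just illustrative choices.

def M(s):
    """Compute the 2x2 integer matrix M(s) of a bit string s, following the
    definition above: M(empty) = identity, M(0) = [[1, 1], [0, 1]],
    M(1) = [[1, 0], [1, 1]], and M(xy) = M(x) M(y) (matrix product)."""
    result = [[1, 0], [0, 1]]            # M(empty string)
    base = {'0': [[1, 1], [0, 1]],       # M(0)
            '1': [[1, 0], [1, 1]]}       # M(1)
    for ch in s:
        a, b = result, base[ch]
        result = [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
                  [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]
    return result

print(M("01"))   # [[2, 1], [1, 1]], i.e. M(0) * M(1)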

Show that this fingerprint function has the following properties.


i. $M(x)$ is well defined for all $x \in \{0, 1\}^*$.
ii. $M(x) = M(y) \implies x = y$.
iii. For $x \in \{0, 1\}^n$, the entries in $M(x)$ are bounded by the Fibonacci number $F_n$.
By considering the matrices $M(x)$ modulo a suitable prime $p$, show how you would perform
efficient randomized pattern matching in one dimension. Explain how you would implement this
as a real-time algorithm.
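A minimal Python sketch of one possible way such a matcher could look: it compares window fingerprints taken modulo a prime $p$, and slides the window using the fact that $M(0)$ and $M(1)$ have determinant 1 and are therefore invertible modulo $p$. The prime, the helper names, and the example strings are illustrative assumptions, not the intended solution.

def matmul_mod(a, b, p):
    """Multiply two 2x2 matrices modulo p."""
    return [[(a[0][0]*b[0][0] + a[0][1]*b[1][0]) % p, (a[0][0]*b[0][1] + a[0][1]*b[1][1]) % p],
            [(a[1][0]*b[0][0] + a[1][1]*b[1][0]) % p, (a[1][0]*b[0][1] + a[1][1]*b[1][1]) % p]]

def match_positions(text, pattern, p):
    """Report every position i where the fingerprint (mod p) of text[i:i+m]
    equals the fingerprint (mod p) of the pattern.  A reported position may be
    a false positive; its probability is controlled by the choice of p."""
    m = len(pattern)
    BASE = {'0': [[1, 1], [0, 1]], '1': [[1, 0], [1, 1]]}
    # det M(0) = det M(1) = 1, so both matrices are invertible modulo any prime p.
    INV = {'0': [[1, p - 1], [0, 1]], '1': [[1, 0], [p - 1, 1]]}

    def fingerprint(s):
        f = [[1, 0], [0, 1]]
        for ch in s:
            f = matmul_mod(f, BASE[ch], p)
        return f

    target = fingerprint(pattern)
    window = fingerprint(text[:m])
    hits = []
    for i in range(len(text) - m + 1):
        if window == target:
            hits.append(i)
        if i + m < len(text):
            # Slide the window: strip text[i] on the left, append text[i+m] on the right.
            window = matmul_mod(matmul_mod(INV[text[i]], window, p), BASE[text[i + m]], p)
    return hits

print(match_positions("0110101", "101", p=10**9 + 7))   # [2, 4]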
(b) Consider the two-dimensional version of the pattern matching problem. The text is an $n \times n$
matrix $X$, and the pattern is an $m \times m$ matrix $Y$. A pattern match occurs if $Y$ appears as a
(contiguous) sub-matrix of $X$. Take inspiration from the randomized algorithm of part (a) above to
design a randomized algorithm for this two-dimensional pattern matching. The running time should
be $O(n^2 + m^2)$ and the error probability should be inverse polynomial in terms of $n$ and $m$.
Hint: think about how to convert two-dimensional pattern matching into one-dimensional pattern
matching. This hint will be expanded in the doubt-clearing session this weekend. Till then, keep
pondering over it.
2. This question deals with the Chernoff bound.
(a) How well did you understand the proof of the Chernoff bound?
Consider a collection $X_1, \ldots, X_n$ of $n$ independent geometrically distributed random variables
with mean 2. Let $X = \sum_{i=1}^{n} X_i$ and $\delta > 0$. (A simulation sketch for this tail
probability is given after the sub-parts below.)
i. Derive a bound on $P(X \geq (1 + \delta)(2n))$ by applying the Chernoff bound to a sequence of
$(1 + \delta)(2n)$ fair coin tosses.
ii. Directly derive a Chernoff-like bound on $P(X \geq (1 + \delta)(2n))$.
iii. Which bound is better?
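A minimal Python simulation sketch for part (a): it estimates the tail probability $P(X \geq (1+\delta)\,2n)$ empirically, using the fact that $X$ equals the number of fair-coin flips needed to observe $n$ HEADS. The values of $n$, $\delta$, and the number of trials below are hypothetical.

import random

def empirical_tail(n, delta, trials=20_000):
    """Monte Carlo estimate of P(X >= (1 + delta) * 2n), where X is the sum of
    n independent Geometric(1/2) random variables with mean 2; equivalently,
    X is the number of fair-coin flips needed to see n HEADS."""
    threshold = (1 + delta) * 2 * n
    exceed = 0
    for _ in range(trials):
        flips, heads = 0, 0
        while heads < n:
            flips += 1
            if random.random() < 0.5:
                heads += 1
        if flips >= threshold:
            exceed += 1
    return exceed / trials

if __name__ == "__main__":
    # Hypothetical parameters, just to watch the tail shrink as delta grows.
    for delta in (0.1, 0.25, 0.5):
        print(delta, empirical_tail(n=50, delta=delta))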

(b) Estimating the bias of a coin

Suppose you are given a biased coin that has $P[\mathrm{HEADS}] = p \geq a$, for some fixed $a$. This is
all that you know about $p$. Devise a procedure for estimating $p$ by a value $\hat{p}$ such that you can
guarantee that
$$P[\,|p - \hat{p}| > \epsilon p\,] < \delta$$
for any choice of the constants $0 < a, \epsilon, \delta < 1$. Let $N$ be the number of times you need to flip the
biased coin to obtain this estimate. What is the smallest value of $N$ for which you can still give
this guarantee?
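A minimal Python simulation sketch for part (b), assuming the natural estimator: flip the coin $N$ times and take the fraction of HEADS as $\hat{p}$. It only measures the empirical failure rate $P[\,|p - \hat{p}| > \epsilon p\,]$ for a few hypothetical values of $p$, $\epsilon$, and $N$; it does not derive the smallest $N$.

import random

def empirical_failure_rate(p, N, eps, trials=5_000):
    """Estimate P(|p - p_hat| > eps * p) when p_hat is the fraction of HEADS
    observed in N independent flips of a coin with P[HEADS] = p."""
    failures = 0
    for _ in range(trials):
        heads = sum(1 for _ in range(N) if random.random() < p)
        p_hat = heads / N
        if abs(p - p_hat) > eps * p:
            failures += 1
    return failures / trials

if __name__ == "__main__":
    # Hypothetical parameters: a coin with p = 0.15 (so p >= a for, say, a = 0.1)
    # and relative accuracy eps = 0.1.
    p, eps = 0.15, 0.1
    for N in (200, 1000, 5000):
        print(N, empirical_failure_rate(p, N, eps))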
