
CSE 415 - Pattern Recognition Sample questions

Duda chapter 1

- What is a pattern recognition system? Briefly discuss a typical such system and its components.
- What are feature extraction, training data, test data, and post-processing?
- Briefly discuss the decision cycle of a pattern recognition system.
- What are supervised, unsupervised, and reinforcement learning?

Duda chapter 2

- What is Bayesian decision theory? Discuss the theory with the necessary figures and equations.
- Suppose in a two-category classification problem (W1, W2) the following table of data is given. The data is prepared from a sample of size 300, containing 140 samples of W1 and 160 samples of W2.

      Length (cm):   10  11  12  13  14  15  16  17  18  19  20 | Total
      W1 samples:    10  15  18  17  25  20  15   8   7   3   2 |   140
      W2 samples:     5   8   7   5  10  12  18  22  28  25  20 |   160

  From the table of data, derive the equation of the decision boundary to classify W1 and W2.
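As a check on the table question, here is a minimal numerical sketch, assuming the relative frequencies in the table are used as estimates of the class-conditional likelihoods P(x|Wi) and the priors are taken from the sample sizes (variable names are my own):

```python
# Sketch: locate the Bayes decision boundary from the frequency table.
# Assumption: relative frequencies estimate P(x|Wi); priors come from sample sizes.
lengths = list(range(10, 21))
w1 = [10, 15, 18, 17, 25, 20, 15, 8, 7, 3, 2]    # counts for W1 (total 140)
w2 = [5, 8, 7, 5, 10, 12, 18, 22, 28, 25, 20]    # counts for W2 (total 160)

p_w1, p_w2 = 140 / 300, 160 / 300                # priors P(W1), P(W2)

def posterior_ratio(i):
    """P(W1|x) / P(W2|x) for the i-th length bin (Bayes rule; P(x) cancels)."""
    like1 = w1[i] / 140                          # estimate of P(x|W1)
    like2 = w2[i] / 160                          # estimate of P(x|W2)
    return (like1 * p_w1) / (like2 * p_w2)

# Decide W1 where the posterior ratio exceeds 1, W2 otherwise.
decisions = ["W1" if posterior_ratio(i) > 1 else "W2" for i in range(len(lengths))]
for x, d in zip(lengths, decisions):
    print(x, "cm ->", d)
```

With these priors the ratio reduces to w1[i]/w2[i], so the decision flips where the raw counts cross: W1 for 10–15 cm and W2 from 16 cm on, putting the boundary between 15 and 16 cm.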

- In a two-category classification problem (W1, W2), the prior probabilities are P(W1) = 0.4 and P(W2) = 0.6. The class-conditional distributions P(x|W1) and P(x|W2), where x is a one-dimensional feature vector, are given, and the loss matrix Λ = [λij] is also given, where λij is the loss incurred in selecting class i when the true class is j. Find the likelihood ratio, and derive the decision boundary with respect to the likelihood ratio.
- In a C-category classification problem (w1, w2, ..., wc), the loss function λ(αi|wj) for taking action αi when the true class is wj is given. Derive the expression for the conditional risk.

Also practice the following.
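For reference when practicing the two questions above, the standard results from Duda chapter 2 (with symbols matching the problem statements: λij is the loss for deciding class i when the true class is j) are:

```latex
% Conditional risk of taking action \alpha_i given observation x:
R(\alpha_i \mid x) \;=\; \sum_{j=1}^{C} \lambda(\alpha_i \mid w_j)\, P(w_j \mid x)

% Two-category case: decide W_1 when R(\alpha_1|x) < R(\alpha_2|x),
% which rearranges into the likelihood-ratio test
\frac{p(x \mid W_1)}{p(x \mid W_2)} \;>\;
\frac{\lambda_{12} - \lambda_{22}}{\lambda_{21} - \lambda_{11}}
\cdot \frac{P(W_2)}{P(W_1)}
\quad\Longrightarrow\quad \text{decide } W_1
```

The decision boundary is the set of x where the likelihood ratio equals the threshold on the right-hand side.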


Duda chapter 5 and ANN

- What is a linear discriminant function? Describe a linear discriminant function for the 2-category and the multi-category classification problem.
- Suppose for a linear classifier g(x) = W^t x + w0 defines the decision hyperplane, where W is the weight vector and x is the feature vector. If xp is a point on the plane and x is any point at distance r from the plane, prove that r = g(x) / ||W||.
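A sketch of the standard proof, writing x as its projection onto the hyperplane plus a component along the normal direction W/||W||:

```latex
x = x_p + r\,\frac{W}{\lVert W \rVert}
\quad\Longrightarrow\quad
g(x) = W^t x + w_0
     = \underbrace{W^t x_p + w_0}_{=\,0 \text{ since } x_p \text{ is on the plane}}
       + r\,\frac{W^t W}{\lVert W \rVert}
     = r\,\lVert W \rVert
\quad\Longrightarrow\quad
r = \frac{g(x)}{\lVert W \rVert}
```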

- What are linearly separable and linearly non-separable samples? Discuss with a figure showing the solution region and a few points on the plane.
- For the following sample S of 2-D points, find the solution region for class x and class y, both with and without a margin:
  S = { class x = {(1,5), (2,6), (2,5)}, class y = {(2,2), (4,1), (3,3)} }
- Simulate the perceptron algorithms: batch perceptron and fixed-increment single-sample perceptron.

- Show, by a suitable perceptron algorithm, how a linear classifier separates the following sample S of points. Show the simulation of each step, and find the equation of the line L such that L > 0 classifies class y and L < 0 classifies class x:
  S = { class x = {(1,5), (2,6), (2,5)}, class y = {(2,2), (4,1), (3,3)} }
  Use the following algorithm, with the line L = a0 + a1*x1 + a2*x2 and target t = +1 for class y, t = -1 for class x (all coefficients are updated only on a misclassification):

      while (all the samples are not classified correctly) {
          for each point p = (x1, x2) with target t in sample S {
              if (t * L(p) <= 0) {      // p is misclassified
                  a0 = a0 + t;
                  a1 = a1 + t * x1;
                  a2 = a2 + t * x2;
              }
          }
      }
      return L;

  (Practice yourself with more sample points.)
- What are linear and non-linear classifiers? Why is the XOR function not separable by a linear classifier? Solve it using an ANN.
- Solve a given set of points with an ANN. (Practice yourself with sample points.)
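The fixed-increment single-sample rule above can be simulated in a few lines of Python. The ±1 target convention and the implicit learning rate of 1 are choices made here for the sketch; convergence is guaranteed because the sample is linearly separable:

```python
# Fixed-increment single-sample perceptron on the sample S above.
# Convention (an assumption): target t = +1 for class y, t = -1 for class x,
# so the learned line L(p) = a0 + a1*x1 + a2*x2 is positive on class y.
samples = [((1, 5), -1), ((2, 6), -1), ((2, 5), -1),   # class x
           ((2, 2), +1), ((4, 1), +1), ((3, 3), +1)]   # class y

a0, a1, a2 = 0.0, 0.0, 0.0                             # coefficients of L

def L(p):
    return a0 + a1 * p[0] + a2 * p[1]

changed = True
while changed:                                          # repeat until a full clean pass
    changed = False
    for p, t in samples:
        if t * L(p) <= 0:                               # misclassified (or on the line)
            a0 += t                                     # fixed-increment update
            a1 += t * p[0]
            a2 += t * p[1]
            changed = True

print("L = %g + %g*x1 + %g*x2" % (a0, a1, a2))
```

Printing (a0, a1, a2) inside the update gives exactly the step-by-step simulation the question asks for.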

Sergios chapter 8

- Everything about edit distance: theory, algorithm, and DP simulation. (Practice well.)
- 2-D logarithmic search and hierarchical search. (See exercise 8.6.)
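For the edit-distance DP practice, here is a minimal reference implementation of the standard Levenshtein recurrence with unit costs (function and variable names are my own; printing the table D row by row reproduces the DP simulation asked for above):

```python
def edit_distance(a, b):
    """Levenshtein distance between strings a and b via dynamic programming.

    D[i][j] = minimum cost of transforming a[:i] into b[:j] using unit-cost
    insertions, deletions, and substitutions.
    """
    m, n = len(a), len(b)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        D[i][0] = i                               # delete all of a[:i]
    for j in range(n + 1):
        D[0][j] = j                               # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            D[i][j] = min(D[i - 1][j] + 1,        # deletion
                          D[i][j - 1] + 1,        # insertion
                          D[i - 1][j - 1] + sub)  # match / substitution
    return D[m][n]

print(edit_distance("kitten", "sitting"))  # -> 3
```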