
Pattern Recognition

Speaker: Wen-Fu Wang
Advisor: Jian-Jiun Ding
E-mail: r96942061@ntu.edu.tw
Graduate Institute of Communication Engineering
National Taiwan University, Taipei, Taiwan, ROC

Outline
Introduction
Minimum Distance Classifier
Matching by Correlation
Optimum statistical classifiers
Matching Shape Numbers
String Matching
2

Outline
Syntactic Recognition of Strings: String Grammars
Syntactic Recognition of Tree Grammars
Conclusions

Introduction
Basic pattern recognition flowchart

Sensor → Feature generation → Feature selection → Classifier design → System evaluation

Introduction
The approaches to pattern recognition discussed here fall into two principal areas: decision-theoretic and structural. The first category deals with patterns described by quantitative descriptors, such as length, area, and texture. The second category deals with patterns best described by qualitative descriptors, such as relational descriptors.
5

Minimum Distance Classifier


Suppose that we define the prototype of each pattern class to be the mean vector of the patterns of that class:

m_j = (1/N_j) Σ_{x∈ω_j} x,    j = 1, 2, ..., W        (1)

Using the Euclidean distance to determine closeness reduces the problem to computing the distance measures

D_j(x) = ||x − m_j||,    j = 1, 2, ..., W        (2)
6

Minimum Distance Classifier


The smallest distance is equivalent to evaluating the functions

d_j(x) = x^T m_j − (1/2) m_j^T m_j,    j = 1, 2, ..., W        (3)

The decision boundary between classes ω_i and ω_j for a minimum distance classifier is

d_ij(x) = d_i(x) − d_j(x) = x^T (m_i − m_j) − (1/2)(m_i − m_j)^T (m_i + m_j) = 0        (4)
7
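To make equations (1) and (3) concrete, here is a minimal Python/NumPy sketch (not part of the original slides; the class labels and sample values are invented for illustration). It estimates the class prototypes as mean vectors and assigns a pattern to the class with the largest decision function d_j(x), which is equivalent to the smallest Euclidean distance.

```python
import numpy as np

def train_mean_vectors(samples_by_class):
    # Eq. (1): the prototype of each class is the mean vector of its training patterns
    return {label: patterns.mean(axis=0) for label, patterns in samples_by_class.items()}

def classify_min_distance(x, means):
    # Eq. (3): d_j(x) = x^T m_j - (1/2) m_j^T m_j; choose the class with the largest d_j(x),
    # which is the same as choosing the smallest Euclidean distance D_j(x) = ||x - m_j||
    scores = {label: x @ m - 0.5 * (m @ m) for label, m in means.items()}
    return max(scores, key=scores.get)

# Hypothetical two-class, two-feature example
samples = {
    "class_1": np.array([[0.1, 0.2], [0.3, 0.1], [0.2, 0.3]]),
    "class_2": np.array([[1.0, 1.1], [1.2, 0.9], [0.9, 1.0]]),
}
means = train_mean_vectors(samples)
print(classify_min_distance(np.array([0.95, 1.05]), means))  # -> "class_2"
```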

Minimum Distance Classifier


Decision boundary of a minimum distance classifier between class C1 and class C2 in the (x1, x2) plane (the boundary is the line d12(x) = 0).

Minimum Distance Classifier


Advantages:
1. Very intuitive and easy to visualize.
2. Can handle the rotation problem.
3. Handles intensity variations.
4. With suitably chosen features, the mirror problem can also be solved.
5. Color can be chosen as one of the features, so the color problem can also be solved.
9

Minimum Distance Classifier


Disadvantages:
1. It takes time to collect and process samples, and many samples are needed for high accuracy (the more samples, the higher the accuracy).
2. Sensitive to displacement.
3. If only two features are used, the accuracy is lower than that of other methods.
4. Sensitive to scaling.
10

Matching by Correlation
We consider correlation as the basis for finding matches of a subimage w(x, y) of size J × K within an image f(x, y) of size M × N, where we assume that J ≤ M and K ≤ N:

c(x, y) = Σ_s Σ_t f(s, t) w(x + s, y + t)        (5)

for x = 0, 1, 2, ..., M−1 and y = 0, 1, 2, ..., N−1

11

Matching by Correlation
Arrangement for obtaining the correlation of f and w at the point (x0, y0): the J × K window w(x0 + s, y0 + t) is moved over the M × N image f(x, y), with the origin of f at its top-left corner.
12

Matching by Correlation
The correlation function has the disadvantage of being sensitive to changes in the amplitude of f and w. For example, doubling all values of f doubles the value of c(x, y). An approach frequently used to overcome this difficulty is to perform matching via the correlation coefficient

γ(x, y) = Σ_s Σ_t [f(s, t) − f̄(s, t)][w(x + s, y + t) − w̄] / {Σ_s Σ_t [f(s, t) − f̄(s, t)]² Σ_s Σ_t [w(x + s, y + t) − w̄]²}^(1/2)

where w̄ is the average value of w (computed only once) and f̄(s, t) is the average value of f in the region coincident with the current location of w. The correlation coefficient γ(x, y) is scaled in the range −1 to 1, independent of scale changes in the amplitude of f and w.

13
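As an illustration only (this sketch is not from the slides, and the function and variable names are mine), the correlation coefficient above can be computed by sliding the template over the image and normalizing both the template and the region under it:

```python
import numpy as np

def correlation_coefficient_map(f, w):
    """Slide the template w (J x K) over the image f (M x N) and return the
    correlation coefficient at every valid position; values lie in [-1, 1]."""
    J, K = w.shape
    M, N = f.shape
    w_zero = w - w.mean()                      # w-bar is computed only once
    w_norm = np.sqrt(np.sum(w_zero ** 2))
    gamma = np.zeros((M - J + 1, N - K + 1))
    for x in range(M - J + 1):
        for y in range(N - K + 1):
            region = f[x:x + J, y:y + K]
            r_zero = region - region.mean()    # f-bar over the region under w
            denom = w_norm * np.sqrt(np.sum(r_zero ** 2))
            gamma[x, y] = np.sum(r_zero * w_zero) / denom if denom > 0 else 0.0
    return gamma

# The best match is where the coefficient is largest:
# x_best, y_best = np.unravel_index(np.argmax(gamma), gamma.shape)
```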

Matching by Correlation
Advantages:
1. Fast.
2. Convenient.
3. Handles displacement.

Disadvantages:
1. Sensitive to scaling.
2. Sensitive to rotation.
3. Similar shapes are hard to distinguish.
4. Sensitive to intensity.
5. Mirror problem.
6. Color cannot be recognized.
14

Optimum statistical classifiers


The probability that a particular pattern x comes from class ω_i is denoted p(ω_i | x). If the pattern classifier decides that x came from ω_j when it actually came from ω_i, it incurs a loss, denoted L_ij. The average loss incurred in assigning x to class ω_j is

r_j(x) = Σ_{k=1}^{W} L_kj p(ω_k | x)
15

Optimum statistical classifiers


From basic probability theory, we know that

p(A | B) = [p(A) p(B | A)] / p(B)

so the average loss can be written as

r_j(x) = (1/p(x)) Σ_{k=1}^{W} L_kj p(x | ω_k) P(ω_k)

Because 1/p(x) is positive and common to all the r_j(x), j = 1, 2, ..., W, it can be dropped without affecting the relative order of these functions:

r_j(x) = Σ_{k=1}^{W} L_kj p(x | ω_k) P(ω_k)
16

Optimum statistical classifiers


Thus the Bayes classifier assigns an unknown pattern x to class ω_i if r_i(x) < r_j(x) for all j ≠ i; that is, if

Σ_{k=1}^{W} L_ki p(x | ω_k) P(ω_k) < Σ_{q=1}^{W} L_qj p(x | ω_q) P(ω_q)

If the loss is zero for correct decisions and one for incorrect decisions,

L_ij = 1 − δ_ij

where δ_ij = 1 if i = j and δ_ij = 0 otherwise, the average loss reduces to

r_j(x) = Σ_{k=1}^{W} (1 − δ_kj) p(x | ω_k) P(ω_k) = p(x) − p(x | ω_j) P(ω_j)
17

Optimum statistical classifiers


The Bayes classifier then assigns a pattern x to class ω_i if, for all j ≠ i,

p(x) − p(x | ω_i) P(ω_i) < p(x) − p(x | ω_j) P(ω_j)

or, equivalently, if

p(x | ω_i) P(ω_i) > p(x | ω_j) P(ω_j)

so the Bayes classifier uses decision functions of the form

d_j(x) = p(x | ω_j) P(ω_j),    j = 1, 2, ..., W
18

Optimum statistical classifiers


Bayes Classifier for Gaussian Pattern Classes. Let us consider a 1-D problem (n = 1) involving two pattern classes (W = 2) governed by Gaussian densities:

d_j(x) = p(x | ω_j) P(ω_j) = [1 / (√(2π) σ_j)] e^(−(x − m_j)² / (2σ_j²)) P(ω_j),    j = 1, 2
19

Optimum statistical classifiers


In the n-dimensional case, the Gaussian density of the vectors in the jth pattern class has the form

p(x | ω_j) = [1 / ((2π)^(n/2) |C_j|^(1/2))] e^(−(1/2)(x − m_j)^T C_j^(−1) (x − m_j))

where m_j and C_j are the mean vector and covariance matrix of the patterns in class ω_j.
20
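A brief sketch, assuming Gaussian class densities as above (this code is my illustration, not the author's): the decision function d_j(x) = p(x | ω_j) P(ω_j) is evaluated for each class and the largest value wins.

```python
import numpy as np

def gaussian_decision(x, mean, cov, prior):
    # d_j(x) = p(x | w_j) P(w_j) with an n-dimensional Gaussian density
    n = len(mean)
    diff = x - mean
    norm = 1.0 / ((2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(cov)))
    return prior * norm * np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff)

def bayes_classify(x, classes):
    # classes: {label: (mean_vector, covariance_matrix, prior_probability)}
    return max(classes, key=lambda c: gaussian_decision(x, *classes[c]))

# Hypothetical 2-D example with equal priors
classes = {
    "w1": (np.array([0.0, 0.0]), np.eye(2), 0.5),
    "w2": (np.array([3.0, 3.0]), np.eye(2), 0.5),
}
print(bayes_classify(np.array([2.5, 2.8]), classes))  # -> "w2"
```

In practice m_j and C_j would be estimated from training samples of each class, which is the sample-counting cost mentioned in the disadvantages below.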

Optimum statistical classifiers


Advantages:
1. It is usually combined with other methods, which gives high accuracy.

Disadvantages:
1. It takes time to estimate the statistics from samples.
2. It has to be combined with other methods.

21

Matching Shape Numbers


Direction numbers for 4-directional chain code, and 8-directional chain code

22

Matching Shape Numbers


Digital boundary with resampling grid superimposed

23

Matching Shape Numbers


All shapes of order 4, 6, and 8

Order 4: chain code 0321;     difference 3333;     shape no. 3333
Order 6: chain code 003221;   difference 303303;   shape no. 033033
Order 8: chain code 00332211; difference 30303030; shape no. 03030303
Order 8: chain code 03032211; difference 33133030; shape no. 03033133
Order 8: chain code 00032221; difference 30033003; shape no. 00330033

24
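A small sketch (my own implementation, not the original author's) of computing the first difference and the shape number of a 4-directional chain code; it reproduces the order-4 and order-6 rows in the table above:

```python
def first_difference(chain, directions=4):
    # Number of counterclockwise direction changes between consecutive codes,
    # treating the chain code as circular.
    return [(chain[i] - chain[i - 1]) % directions for i in range(len(chain))]

def shape_number(chain, directions=4):
    # The shape number is the circular rotation of the first difference
    # that forms the smallest integer.
    diff = first_difference(chain, directions)
    rotations = [diff[i:] + diff[:i] for i in range(len(diff))]
    return min(rotations, key=lambda r: int("".join(map(str, r))))

print(shape_number([0, 3, 2, 1]))        # [3, 3, 3, 3]       -> shape no. 3333
print(shape_number([0, 0, 3, 2, 2, 1]))  # [0, 3, 3, 0, 3, 3] -> shape no. 033033
```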

Matching Shape Numbers


Advantages:
1. Matching shape numbers is well suited to simple figures, especially those composed of line segments.
2. Can handle the rotation problem.
3. Because the method emphasizes the outline of the figure, the shape-similarity problem can be overcome completely.
4. The displacement problem can definitely be overcome, because the method relies on relative position rather than absolute position.

25

Matching Shape Numbers


Disadvantages:
1. It cannot be used for hollow structures.
2. Scaling is a shortcoming that has to be handled by other means.
3. Sensitive to intensity.
4. Mirror problem.
5. Color cannot be recognized.
26

String Matching
Suppose that two region boundaries, a and b, are coded into strings denoted a1 a2 ... an and b1 b2 ... bm, respectively. Let α represent the number of matches between the two strings, where a match occurs in the kth position if a_k = b_k. The number of symbols that do not match is

β = max(|a|, |b|) − α

where |a| and |b| are the lengths (number of symbols) of the strings.
27

String Matching
A simple measure of similarity between a and b is the ratio

R = α / β = α / (max(|a|, |b|) − α)

Hence R is infinite for a perfect match and 0 when none of the corresponding symbols in a and b match (α = 0 in this case).
28
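A tiny sketch (illustrative only; not from the slides) of the similarity measure R:

```python
def string_similarity(a, b):
    # alpha: number of positions where the two boundary strings match
    alpha = sum(1 for x, y in zip(a, b) if x == y)
    beta = max(len(a), len(b)) - alpha          # number of non-matching symbols
    return float("inf") if beta == 0 else alpha / beta

print(string_similarity("abab", "abab"))    # inf (perfect match)
print(string_similarity("abab", "baba"))    # 0.0 (no matching positions)
print(string_similarity("ababab", "abab"))  # 4 / (6 - 4) = 2.0
```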

String Matching
Simple staircase structure and its coded representation as a string of primitive symbols.
29

String Matching
Advantages:
1. String matching is well suited to simple figures, especially those composed of line segments.
2. Can handle the rotation problem.
3. Robust to intensity.
4. Handles the mirror problem.
5. Because the method emphasizes the outline of the figure, the shape-similarity problem can be overcome completely.
6. The displacement problem can definitely be overcome, because the method relies on relative position rather than absolute position.
30

String Matching
Disadvantages:
1. It cannot be used for hollow structures.
2. Sensitive to scaling.
3. Color cannot be recognized.

31

Syntactic Recognition of Strings: String Grammars


When dealing with strings, we define a grammar as the 4-tuple G = (N, Σ, P, S), where:
N is a finite set of variables called nonterminals,
Σ is a finite set of constants called terminals,
P is a set of rewriting rules called productions, and
S (in N) is the starting symbol.
32
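To illustrate the definition, here is a minimal sketch with a made-up regular string grammar (the nonterminals, terminals, and productions below are my own example, not the one used in the slides): N = {S, A}, Σ = {a, b}, S the start symbol, and P = {S → aA, A → bS, A → b}. The language generated is (ab)^n, n ≥ 1, which a recognizer can test directly.

```python
import re

def generate(n_steps):
    # Derivation S => aA => abS => abaA => ... using S -> aA, A -> bS, ending with A -> b
    return "ab" * n_steps

def recognize(string):
    # A string belongs to the language of this grammar iff it matches (ab)+
    return re.fullmatch(r"(ab)+", string) is not None

print(generate(3))           # 'ababab'
print(recognize("ababab"))   # True
print(recognize("abba"))     # False
```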

Syntactic Recognition of Strings: String Grammars


Object represented by its skeleton, the primitives a, b, c used to describe it, and the structure generated by using a regular string grammar.

33

Syntactic Recognition of Strings: String Grammars


Advantages:
1. This method can be applied to more complex structures.
2. It is a good method for character sets.

Disadvantages:
1. Sensitive to scaling.
2. Sensitive to rotation.
3. Color cannot be recognized.
4. Sensitive to intensity.
5. Mirror problem.
34

Syntactic Recognition of Tree Grammars


A tree grammar is defined as the 5-tuple G = (N, Σ, P, r, S), where:
N and Σ are sets of nonterminals and terminals, respectively,
S is the start symbol, which in general can be a tree,
P is a set of productions of the form T_i → T_j, where T_i and T_j are trees, and
r is a ranking function that denotes the number of direct descendants (offspring) of a node whose label is a terminal in the grammar.
35

Syntactic Recognition of Tree Grammars


Of particular relevance to our discussion are expansive tree grammars, having productions of the form X → k(X1, X2, ..., Xn), in which a nonterminal X is replaced by a terminal k whose direct descendants are X1, X2, ..., Xn, where X1, X2, ..., Xn are nonterminals and k is a terminal.

36

Syntactic Recognition of Tree Grammars


An object and the primitives used for representing its skeleton by means of a tree grammar.

37

Syntactic Recognition of Tree Grammars


For example, using the primitives a, b, c, d, e, the tree grammar has the productions

(1) S → a, with descendant X1
(2) X1 → b, with descendant X1
(3) X1 → c, with descendants X2 and X3
(4) X2 → d, with descendant X2
(5) X2 → a, with descendant X3
(6) X3 → e
(7) X3 → a

38
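As a sketch only (assuming the productions as reconstructed above; the tree representation and helper names are my own), one possible derivation can be written out by representing each node as a (label, children) pair:

```python
# Each tree node is (terminal_label, [children]); the productions are applied top-down.

def derive_example():
    x3_a = ("a", [])           # (7) X3 -> a
    x3_e = ("e", [])           # (6) X3 -> e
    x2 = ("a", [x3_a])         # (5) X2 -> a, with descendant X3
    x2 = ("d", [x2])           # (4) X2 -> d, with descendant X2
    x1 = ("c", [x2, x3_e])     # (3) X1 -> c, with descendants X2 and X3
    x1 = ("b", [x1])           # (2) X1 -> b, with descendant X1
    return ("a", [x1])         # (1) S  -> a, with descendant X1

def show(tree, depth=0):
    label, children = tree
    print("  " * depth + label)
    for child in children:
        show(child, depth + 1)

show(derive_example())  # prints the derived skeleton tree level by level
```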

Syntactic Recognition of Tree Grammars


Advantages:
1. This method can be applied to more complex structures.
2. It is a good method for character sets.
3. The displacement problem can definitely be overcome, because the method relies on relative position rather than absolute position.
39

Syntactic Recognition of Tree Grammars


Disadvantages:
1. Scaling is a shortcoming that has to be handled by other means.
2. Sensitive to rotation.
3. Color cannot be recognized.
4. Sensitive to intensity.
40

Conclusions
Pattern recognition is a science that covers a very wide domain. Over the past few decades, all kinds of methods have been continually developed and continually improved on the basis of various probabilistic and statistical models and practical application models. Pattern recognition is applied in many different application domains, and each application often gives the whole recognition system a different appearance, so we cannot define any one method as the "best" pattern recognition method.

41

Conclusions
To summarize the seven approaches to pattern recognition: each method has its own advantages and disadvantages. Therefore, we have to understand each method precisely and then choose the suitable method for efficiency and accuracy. A method that achieves a very good recognition rate in one application does not imply that the same method, applied mechanically to another application, will achieve a similarly good recognition rate.

42

Conclusions
Several possible solutions to the problems above:
1. The scaling problem may be solved by using a reference area.
2. Neural networks can be used to solve the rotation problem.
3. The color problem can be solved with RGB values, or the spectrum can be used to distinguish colors.
4. Correlation with a reversed matched filter can be used for the intensity and mirror problems.
5. A measure of area can be used for hollow structures.

43

References
[1] R. C. Gonzalez, R. E. Woods, "Digital Image Processing, Second Edition", Prentice Hall, 2002
[2] "Matlab", 2005
[3] S. Theodoridis, K. Koutroumbas, "Pattern Recognition", Academic Press, 1999
[4] W. K. Pratt, "Digital Image Processing, Third Edition", John Wiley & Sons, 2001
[5] R. C. Gonzalez, R. E. Woods, S. L. Eddins, "Digital Image Processing Using MATLAB", Prentice Hall, 2005
[6] "Matlab", 2000
[7] J. Schurmann, "A Unified View of Statistical and Neural Approaches", Pattern Classification, Chap. 4, John Wiley & Sons, Inc., 1996
44

References
[8] K. Fukunaga, "Introduction to Statistical Pattern Recognition, Second Edition", Academic Press, Inc., 1990
[9] E. Gose, R. Johnsonbaugh, and S. Jost, "Pattern Recognition and Image Analysis", Prentice Hall Inc., New Jersey, 1996
[10] R. J. Schalkoff, "Pattern Recognition: Statistical, Structural and Neural Approaches", Chap. 5, John Wiley & Sons, Inc., 1992
[11] J. S. Pan, F. R. McInnes, and M. A. Jack, "Fast Clustering Algorithm for Vector Quantization", Pattern Recognition 29, 511-518, 1996

45
