
Multivariate Gaussian Mixture Models

Steps
1. Plug each dimension of the feature vector you're given into the corresponding 1-D PDF of a component (the covariances are diagonal, so the dimensions are treated independently), then take the product over dimensions.
2. Take the sum, over components, of each component weight multiplied by the product calculated above; this gives p(yt|C) for one feature vector.
3. Take the product of these sums over the whole sequence to give the joint probability P(Y|C) of the sequence Y under class C, as sketched in the code below.
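
Since the notes don't include an implementation, here is a minimal NumPy sketch of the three steps. All names (normal_pdf, gmm_frame_likelihood, gmm_sequence_likelihood) and the array layout (one row per component) are my own choices, not from the course material.

    import numpy as np

    def normal_pdf(x, mean, var):
        # 1-D Gaussian PDF N(x; mean, var); var is a variance, not a std dev
        return np.exp(-((x - mean) ** 2) / (2 * var)) / np.sqrt(2 * np.pi * var)

    def gmm_frame_likelihood(y, means, variances, weights):
        # Steps 1-2: p(y|C) for one feature vector under a diagonal-covariance GMM.
        # means, variances: (n_components, n_dims); weights: (n_components,)
        per_dim = normal_pdf(y[np.newaxis, :], means, variances)  # (n_comp, n_dim)
        per_component = per_dim.prod(axis=1)   # step 1: product over dimensions
        return float(weights @ per_component)  # step 2: weighted sum over components

    def gmm_sequence_likelihood(Y, means, variances, weights):
        # Step 3: P(Y|C) = product of p(y_t|C) over the whole sequence
        return float(np.prod([gmm_frame_likelihood(y, means, variances, weights)
                              for y in Y]))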

Worked Example
Consider a 2-class classification problem, where each class is modelled using a 3-component multivariate (3-dimensional) GMM with the parameters (means, variances and weights) below:

Class C1:
Component 1: m1 = (0, 0, 0), v1 = (1, 2, 1)
Component 2: m2 = (1, 0, 0), v2 = (1, 1, 1)
Component 3: m3 = (1, 1, 1), v3 = (2, 2, 1)
Weights: w1 = 0.5, w2 = 0.3, w3 = 0.2

Class C2:
Component 1: m1 = (0, -1, 0), v1 = (1, 1, 1)
Component 2: m2 = (1, -1, 0), v2 = (1, 1, 2)
Component 3: m3 = (0, -1, -1), v3 = (2, 2, 1)

Calculate to which class the sequence of feature vectors Y = y1, ..., yT given below corresponds (i.e., you should calculate P(Y|C1) and P(Y|C2)):

Y = y1, y2, ..., y5 (a 3x5 matrix of feature vectors; only y1 = (0, 0, 0) is recoverable from the working below)
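
The example's parameters in the layout used by the sketch above. Two caveats: the weights for C2 are missing from these notes (C1's weights are reused below purely so the code runs), and only y1 = (0, 0, 0) can be inferred from the Step 1 numbers.

    import numpy as np

    C1 = dict(
        means=np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 1.]]),
        variances=np.array([[1., 2., 1.], [1., 1., 1.], [2., 2., 1.]]),
        weights=np.array([0.5, 0.3, 0.2]),
    )
    C2 = dict(
        means=np.array([[0., -1., 0.], [1., -1., 0.], [0., -1., -1.]]),
        variances=np.array([[1., 1., 1.], [1., 1., 2.], [2., 2., 1.]]),
        weights=np.array([0.5, 0.3, 0.2]),  # ASSUMED: C2's weights are not in the notes
    )

    y1 = np.array([0., 0., 0.])  # inferred from the PDF values worked out in Step 1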

Step 1

y1 = (0, 0, 0). C1, component 1, y1:

N(0; 0, 1) = 0.399
N(0; 0, 2) = 0.282
N(0; 0, 1) = 0.399

Product: 0.399 x 0.282 x 0.399 = 0.0449

C1, component 2, y1:

N(0; 1, 1) = 0.242
N(0; 0, 1) = 0.399
N(0; 0, 1) = 0.399

Product: 0.242 x 0.399 x 0.399 = 0.0385

C1, component 3, y1:

N(0; 1, 2) = 0.22
N(0; 1, 2) = 0.22
N(0; 1, 1) = 0.242

Product: 0.22 x 0.22 x 0.242 = 0.0117
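
These nine values can be checked with the normal_pdf helper sketched earlier (only four distinct (x, mean, var) triples actually occur):

    for x, m, v in [(0, 0, 1), (0, 0, 2), (0, 1, 1), (0, 1, 2)]:
        print(f"N({x}; {m}, {v}) = {normal_pdf(x, m, v):.3f}")
    # N(0; 0, 1) = 0.399, N(0; 0, 2) = 0.282, N(0; 1, 1) = 0.242, N(0; 1, 2) = 0.220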

Step 2
p(y1|C1) = w1 x 0.0449 + w2 x 0.0385 + w3 x 0.0117
         = 0.5 x 0.0449 + 0.3 x 0.0385 + 0.2 x 0.0117
         = 0.0363
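
This is exactly what the hypothetical gmm_frame_likelihood above computes for the C1 parameters and y1 defined earlier:

    print(gmm_frame_likelihood(y1, **C1))  # ~0.0363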

Step 3
Janco has written this as P(Y|Ci), but I don't think this is what he really means. We want to work out the most probable class for a given feature vector, so I think it should be rearranged to P(Ci|Y), but it's a shitty thing to lose marks on, so it might be worth checking with him. (Note that by Bayes' rule P(Ci|Y) is proportional to P(Y|Ci)P(Ci), so if the class priors are equal, both versions pick the same class.) Anyway, once you've done steps 1 and 2 for every yt, you can calculate the joint probability of the sequence and some class Ci by multiplying all the terms together, i.e.

P(Y|Ci) = p(y1|Ci) x p(y2|Ci) x ... x p(y5|Ci)

Whichever class gives the larger value wins.
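
A sketch of the final comparison. Since y2...y5 did not survive in these notes, the sequence below is a stand-in built from y1 alone; also note that in practice you would sum log-likelihoods instead of multiplying raw probabilities, to avoid numerical underflow on long sequences.

    Y = np.array([y1, y1, y1, y1, y1])  # PLACEHOLDER: only y1 is known from the notes

    for name, params in [("C1", C1), ("C2", C2)]:
        loglik = sum(np.log(gmm_frame_likelihood(y, **params)) for y in Y)
        print(f"log P(Y|{name}) = {loglik:.3f}")  # larger (less negative) wins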
