So far, we have seen how to decompose a symmetric matrix into a product of orthogonal matrices (the eigenvector matrix) and a diagonal matrix (the eigenvalue matrix). What about non-symmetric matrices? The key insight behind SVD is to find two orthogonal basis representations U and V such that for any matrix A, the following decomposition holds:

A = U Σ V^T

where U and V are orthogonal matrices (so U U^T = I and V V^T = I) and Σ is a diagonal matrix whose entries are called singular values (hence the meaning behind the term SVD).
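This decomposition can be checked numerically; a minimal sketch in numpy (the matrix entries below are illustrative, not from the slides):

```python
import numpy as np

# A small non-symmetric (even non-square) matrix; illustrative values.
A = np.array([[3.0, 1.0],
              [1.0, 2.0],
              [0.0, 1.0]])

# Full SVD: A = U @ Sigma @ V.T, with s holding the singular values.
U, s, Vt = np.linalg.svd(A)
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# U and V are orthogonal, and the product reconstructs A.
print(np.allclose(U @ U.T, np.eye(3)))    # True
print(np.allclose(Vt @ Vt.T, np.eye(2)))  # True
print(np.allclose(U @ Sigma @ Vt, A))     # True
```

Note that the decomposition exists even when A is rectangular, in which case Σ has the same shape as A with the singular values on its diagonal.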
689:F03 p.23/28
We want to choose the matrices so that not only are the v_i an orthonormal basis set (mutually orthogonal unit vectors), but the images Av_i are also orthogonal to each other. We can then construct the orthonormal set u_i = Av_i / ||Av_i||, so that A v_i = σ_i u_i with σ_i = ||Av_i||.

[Figure: A maps the orthonormal vectors v1, v2 to the orthogonal vectors σ1 u1, σ2 u2.]

In matrix form:

AV = UΣ   or   AV V^-1 = UΣ V^-1   or   A = UΣ V^T

(since V is orthogonal, V^-1 = V^T).
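The column-by-column relation A v_i = σ_i u_i can be verified directly; a short numpy check (example matrix is illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 0.0]])   # any square example works

U, s, Vt = np.linalg.svd(A)
V = Vt.T

# Each right singular vector v_i is mapped to sigma_i * u_i, so the
# images A v_i are orthogonal even though the v_i were chosen
# orthonormal in the input space.
for i in range(2):
    assert np.allclose(A @ V[:, i], s[i] * U[:, i])

# Equivalently, in matrix form: A V = U Sigma.
assert np.allclose(A @ V, U @ np.diag(s))
print("AV = U Sigma verified")
```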
How do we go about finding U and V? Here is the trick. We can eliminate U from the equation AV = UΣ by premultiplying A by its transpose:

A^T A = (UΣV^T)^T (UΣV^T) = (V Σ^T U^T)(U Σ V^T) = V Σ^T Σ V^T
Since A^T A is always symmetric (why?), the above expression gives us exactly the familiar spectral decomposition we have seen before (namely, Q Λ Q^T), except that now V represents the orthonormal eigenvector set of A^T A, and the diagonal of Σ^T Σ holds its eigenvalues, the squared singular values. In a similar fashion, we can eliminate V from the equation AV = UΣ by postmultiplying A by its transpose:

A A^T = (UΣV^T)(UΣV^T)^T = (UΣV^T)(V Σ^T U^T) = U Σ Σ^T U^T
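This connection between the spectral decomposition of A^T A and the SVD of A can be verified numerically; a sketch in numpy (matrix entries are illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 0.0],
              [1.0, 1.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# A^T A is symmetric, so it has a spectral decomposition Q Lambda Q^T;
# eigh returns the eigenvalues in ascending order.
eigvals, Q = np.linalg.eigh(A.T @ A)

# The eigenvalues of A^T A are the squared singular values of A.
assert np.allclose(np.sort(eigvals)[::-1], s**2)

print("eigenvalues of A^T A: ", np.sort(eigvals)[::-1])
print("squared singular values:", s**2)
```

The eigenvectors returned by `eigh` span the same directions as the rows of V^T, up to sign and ordering, which is why the comparison above is made on eigenvalues only.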
Examples of SVD
A = [ 2  2 ]
    [ 1  1 ]
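The slide's worked example is garbled in extraction; assuming the visible entries form the rank-1 matrix A = [[2, 2], [1, 1]], its SVD is easy to check in numpy:

```python
import numpy as np

# Assumed example matrix (entries 2, 2, 1, 1 recovered from the slide).
A = np.array([[2.0, 2.0],
              [1.0, 1.0]])

U, s, Vt = np.linalg.svd(A)
print(s)   # singular values of this rank-1 matrix: sqrt(10) and 0

assert np.isclose(s[0], np.sqrt(10))
assert np.isclose(s[1], 0.0)
```

Since the two rows are parallel, A has rank 1, so the second singular value is exactly zero: A^T A = [[5, 5], [5, 5]] has eigenvalues 10 and 0.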
The vectors x and y are updated from the link structure:

x'_i = Σ_{j : j links to i} y_j   and   y'_i = Σ_{j : i links to j} x'_j

or, in matrix form,

x' = A^T y   and   y' = A x'
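A minimal sketch of this iteration in numpy, using a hypothetical 4-page link matrix (not from the slides): repeating the updates x' = A^T y, y' = A x' with normalization drives x and y toward the top right and left singular vectors of A.

```python
import numpy as np

# Tiny link matrix: A[i, j] = 1 if page i links to page j (hypothetical web).
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Iterate x' = A^T y and y' = A x', normalizing each round.
x = np.ones(4)
y = np.ones(4)
for _ in range(100):
    x = A.T @ y
    x /= np.linalg.norm(x)
    y = A @ x
    y /= np.linalg.norm(y)

# After convergence, x and y match the top right/left singular
# vectors of A (up to sign).
U, s, Vt = np.linalg.svd(A)
assert np.allclose(np.abs(x), np.abs(Vt[0]), atol=1e-6)
assert np.allclose(np.abs(y), np.abs(U[:, 0]), atol=1e-6)
print("converged to the top singular vectors")
```

One round of the loop multiplies x by A^T A, so this is exactly power iteration on A^T A in disguise.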
This is basically doing an iterative SVD computation, and as k increases, the largest eigenvalues dominate. Google™ is doing an SVD over a matrix A of size 3×10^9 by 3×10^9!
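The dominance claim can be checked with a plain power iteration on A^T A (random test matrix, not from the slides): repeated multiplication amplifies the component along the top eigenvector by λ1 = σ1² per step, so the iterate converges to the dominant eigendirection.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))   # random test matrix

# Power iteration: repeatedly apply A^T A and renormalize; the
# component along the top eigenvector grows fastest and dominates.
x = rng.standard_normal(50)
for _ in range(500):
    x = A.T @ (A @ x)
    x /= np.linalg.norm(x)

# The Rayleigh quotient recovers lambda_1 = sigma_1^2.
lam = x @ (A.T @ (A @ x))
s = np.linalg.svd(A, compute_uv=False)
assert np.isclose(lam, s[0]**2)
print("power iteration found sigma_1^2")
```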