(this one-time cost is then justified in the long run)
(diagonalisation leads to computational efficiency)
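A quick sketch of why diagonalisation pays for itself: once the one-time eigendecomposition A = Q diag(lam) Q^T is in hand, any power A^k costs only an elementwise power of the eigenvalues plus two matrix products, instead of k-1 full matrix multiplications. (The matrix here is made up for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((4, 4))
A = S + S.T                        # symmetric, so an orthogonal eigenbasis exists

lam, Q = np.linalg.eigh(A)         # one-time diagonalisation cost
A10_fast = Q @ np.diag(lam**10) @ Q.T      # cheap once Q, lam are known
A10_slow = np.linalg.matrix_power(A, 10)   # repeated multiplication
assert np.allclose(A10_fast, A10_slow)
```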
(orthonormal basis)
(all off-diagonal elements are 0)
(orthonormal)
(orthonormal)
(symmetric)
(true for all matrices)
(same argument)
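A numerical sanity check of the annotations above, under the assumption that they refer to A^T A: for any matrix A (square or not), A^T A is symmetric, its eigenvectors form an orthonormal basis, and in that basis all off-diagonal elements are 0. The same argument applies verbatim to A A^T.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))

M = A.T @ A
assert np.allclose(M, M.T)                      # symmetric, true for all matrices A

lam, Q = np.linalg.eigh(M)
assert np.allclose(Q.T @ Q, np.eye(3))          # eigenvectors are orthonormal
D = Q.T @ M @ Q
assert np.allclose(D - np.diag(np.diag(D)), 0)  # all off-diagonal elements are 0
```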
(we don't know what such V and U are - we are just hoping that they exist)
null space
The first r columns of this product will be sigma_1 u_1, ..., sigma_r u_r and the last n-r columns will be 0
(n-r 0 columns)
m-r 0 rows
The last m-r columns of U will not contribute, and hence the first r columns will be the same as sigma_1 u_1, ..., sigma_r u_r and the last n-r columns will be 0
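The column structure described above can be checked numerically (a sketch, using a made-up rank-2 matrix): in A V = U Sigma, column i equals sigma_i u_i for i < r, and the last n-r columns vanish because those v_i lie in the null space of A.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 5, 4, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank r by construction

U, s, Vt = np.linalg.svd(A)          # full SVD: U is m x m, Vt is n x n
V = Vt.T
for i in range(r):
    assert np.allclose(A @ V[:, i], s[i] * U[:, i])  # sigma_i u_i
for i in range(r, n):
    assert np.allclose(A @ V[:, i], 0)               # null-space directions: 0 columns
```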
diagonal
orthogonal
orthogonal
we know that this eigendecomposition always exists because the matrix being diagonalised is symmetric
diagonal
orthogonal
orthogonal
we know that this eigendecomposition always exists because the matrix being diagonalised is symmetric
HW5: Prove that the non-zero eigenvalues of A A^T and A^T A are always equal
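Not a proof (that is the point of HW5), but a quick numerical check of the claim: for a non-square A, A A^T and A^T A have different sizes, yet their non-zero eigenvalues coincide.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))

ev_big = np.linalg.eigvalsh(A @ A.T)    # 5 eigenvalues, 2 of them ~ 0
ev_small = np.linalg.eigvalsh(A.T @ A)  # 3 eigenvalues
nonzero = ev_big[np.abs(ev_big) > 1e-10]
assert np.allclose(np.sort(nonzero), np.sort(ev_small))
```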
eigenvectors of A A^T
transpose of the eigenvectors of A^T A
square root of the eigenvalues of A^T A or A A^T
since they are eigenvectors of a symmetric matrix
so far we only know that these are the eigenvectors of A A^T
so far we only know that these are the eigenvectors of A^T A
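One way the "work this out" step can be sketched numerically: take V and the sigmas from the eigendecomposition of A^T A, then pair each u_i with its v_i via u_i = A v_i / sigma_i. This fixes the sign ambiguity that two independent eigendecompositions would leave open, and one can check that the resulting U both diagonalises A A^T and reconstructs A. (The matrix and pairing recipe here are illustrative, not the HW solution.)

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))

lam, V = np.linalg.eigh(A.T @ A)      # eigenvectors of A^T A
order = np.argsort(lam)[::-1]         # sort by decreasing eigenvalue
lam, V = lam[order], V[:, order]
sigma = np.sqrt(lam)                  # square roots of the eigenvalues

U = (A @ V) / sigma                   # column i is A v_i / sigma_i
assert np.allclose(U.T @ U, np.eye(4))          # U is orthogonal
assert np.allclose((A @ A.T) @ U, U * lam)      # columns of U are eigenvectors of A A^T
assert np.allclose(A, U @ np.diag(sigma) @ V.T)  # A = U Sigma V^T
```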
Please work this out! You really need to see this on your own! HW5
n-r 0 columns
largest sigma
smallest sigma
we can sort these terms according to sigmas
Frobenius norm
rank-k approximation of A, obtained by dropping the last r - k terms
we will not prove this
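The claim stated without proof above (the Eckart-Young theorem) can still be checked numerically: keeping the k terms with the largest sigmas gives a rank-k approximation whose Frobenius-norm error is exactly sqrt(sigma_{k+1}^2 + ... + sigma_r^2), i.e. the energy of the dropped terms.

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 5))
U, s, Vt = np.linalg.svd(A, full_matrices=False)  # sigmas come sorted, largest first

k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]       # drop the last r - k terms
err = np.linalg.norm(A - A_k, 'fro')
assert np.isclose(err, np.sqrt(np.sum(s[k:] ** 2)))  # Frobenius error = dropped sigmas
```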
Time Travel!