[Figure: the same 2D data plotted twice. Left: standard basis vectors (unit norm). Right: new basis vectors (unit norm) rotated to align with the data, so that one direction captures high variance and the coordinates in the new basis have low covariance.]
(Note that the eigenvectors of cA are the same as the eigenvectors of A for any scalar c ≠ 0, so constant scalings of the covariance matrix do not change its eigenvectors.)
To decorrelate the data, we change basis to the eigenvectors of a symmetric matrix: the covariance matrix of the (zero-mean) data. We know that n such orthonormal vectors will exist since it is a symmetric matrix, and in this basis the transformed covariance matrix is diagonal. These eigenvectors are called the principal components; we keep only the k with the largest eigenvalues.
Heuristics: take k = 50 or 100, or choose the largest k such that λk/λmax > t, i.e. keep only the directions whose eigenvalue is at least a fraction t of the largest eigenvalue.
This eigendecomposition is a one-time cost that is then justified in the long run: diagonalisation leads to computational efficiency, because the transformed features are uncorrelated. A sketch of the whole pipeline follows.
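A minimal numpy sketch of this pipeline, assuming the λk/λmax heuristic with a made-up threshold t = 0.01 (the data, variable names, and threshold are illustrative, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10)) @ rng.standard_normal((10, 10))  # correlated data
X = X - X.mean(axis=0)                      # zero-mean the data

cov = (X.T @ X) / X.shape[0]                # covariance matrix (symmetric)
lam, P = np.linalg.eigh(cov)                # eigh returns ascending eigenvalues
lam, P = lam[::-1], P[:, ::-1]              # sort descending

X_hat = X @ P                               # project onto the eigenbasis
cov_hat = (X_hat.T @ X_hat) / X.shape[0]
off_diag = cov_hat - np.diag(np.diag(cov_hat))
print(np.max(np.abs(off_diag)))             # ~0: transformed covariance is diagonal

t = 0.01                                    # assumed threshold, not from the notes
k = int(np.sum(lam / lam[0] > t))           # largest k with lam_k / lam_max > t
X_reduced = X_hat[:, :k]                    # keep the top-k principal components
print(k, X_reduced.shape)
```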
SVD: we want to factorise any m×n matrix A as A = UΣV', where the columns of U form an orthonormal basis of R^m, the columns of V form an orthonormal basis of R^n (both orthonormal), and Σ is diagonal (all off-diagonal elements are 0). This is true for all matrices, not only square symmetric ones. To hunt for U and V, consider the symmetric matrices A'A and AA' (at this point we don't know what such V and U are - we are just hoping that they exist).
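Writing out the algebra behind this step (it uses only the orthonormality U'U = I and V'V = I):

```latex
\begin{aligned}
A^{\top}A &= (U\Sigma V^{\top})^{\top}(U\Sigma V^{\top})
           = V\,\Sigma^{\top}U^{\top}U\,\Sigma V^{\top}
           = V\,(\Sigma^{\top}\Sigma)\,V^{\top},\\
AA^{\top} &= (U\Sigma V^{\top})(U\Sigma V^{\top})^{\top}
           = U\,\Sigma V^{\top}V\,\Sigma^{\top}U^{\top}
           = U\,(\Sigma\Sigma^{\top})\,U^{\top}.
\end{aligned}
```

Both right-hand sides have the form (orthogonal)(diagonal)(orthogonal)', which is exactly an eigendecomposition of a symmetric matrix.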
Suppose rank(A) = r, so the null space of A has dimension n-r and the last n-r columns of V can be chosen from it. Now consider the product AV = UΣ. Since Σ has n-r 0 columns and m-r 0 rows, the last m-r columns of U will not contribute, and hence the first r columns of the product will be σ_1 u_1, ..., σ_r u_r while the last n-r columns will be 0.
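A quick numerical illustration of this column structure (the 4×3 rank-2 matrix below is a made-up example):

```python
import numpy as np

# a 4x3 matrix of rank r = 2: the third column is the sum of the first two
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 2))
A = np.column_stack([B[:, 0], B[:, 1], B[:, 0] + B[:, 1]])

U, s, Vt = np.linalg.svd(A)        # full SVD: U is 4x4, Vt is 3x3
print(np.round(s, 6))              # sigma_3 ~ 0, confirming rank 2

AV = A @ Vt.T                      # AV = U Sigma
print(np.round(AV, 6))             # last n - r = 1 column is ~0: v_3 is in the null space
```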
Matching terms: take V from the eigendecomposition A'A = V(Σ'Σ)V', with Σ'Σ diagonal and V orthogonal - we know that this always exists because A'A is a symmetric matrix. Likewise take U from AA' = U(ΣΣ')U', with ΣΣ' diagonal and U orthogonal - again this always exists because AA' is a symmetric matrix.
HW5: Prove that the non-zero eigenvalues of AA' and A'A are always equal.
So in A = UΣV', the columns of U are the eigenvectors of AA', the rows of V' are the transposes of the eigenvectors of A'A, and the diagonal entries of Σ are the square roots of the eigenvalues of A'A (or, equivalently, of AA').
The columns of U and V are orthonormal since they are eigenvectors of a symmetric matrix. But there is a gap: so far we only know that the columns of U are the eigenvectors of AA' and that the columns of V are the eigenvectors of A'A - we still have to check that A = UΣV' actually holds with these choices.
Please work this out! You really need to see this on your own! HW5
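Not a proof, but a numerical sanity check of the claims above, including HW5 (the 5×3 random matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=False)   # U: 5x3, s: 3, Vt: 3x3

# eigendecompositions of the two symmetric matrices (eigh returns ascending order)
lam_v, V_eig = np.linalg.eigh(A.T @ A)
lam_u, U_eig = np.linalg.eigh(A @ A.T)

# HW5: the non-zero eigenvalues of A'A and AA' agree, and both equal sigma_i^2
print(np.allclose(lam_v[::-1], s**2))
print(np.allclose(np.sort(lam_u)[::-1][:3], s**2))

# the columns of V match the eigenvectors of A'A up to sign
overlap = np.abs(Vt @ V_eig[:, ::-1])              # reorder eigh output to descending
print(np.allclose(overlap, np.eye(3), atol=1e-8))
```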
Because Σ has n-r 0 columns (only r non-zero singular values), the factorisation expands into a sum of r rank-one terms: A = σ_1 u_1 v_1' + σ_2 u_2 v_2' + ... + σ_r u_r v_r'.
We can sort these terms according to the σs, from the largest sigma σ_1 down to the smallest sigma σ_r.
Dropping the last r-k terms of this sorted sum gives a rank-k approximation of A. It is in fact the best one: among all rank-k matrices B, it minimises the Frobenius norm ||A - B||_F (Eckart-Young theorem; we will not prove this).
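A short numpy sketch of the truncated sum (the 100×50 random matrix and the choices of k are illustrative); it also checks that the Frobenius error equals the root-sum-square of the dropped σs:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

def rank_k_approx(k):
    # keep the k largest sigma_i u_i v_i' terms, drop the last r - k
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

for k in (5, 20, 50):
    err = np.linalg.norm(A - rank_k_approx(k), 'fro')
    tail = np.sqrt(np.sum(s[k:] ** 2))   # error predicted by the dropped sigmas
    print(k, round(err, 4), round(tail, 4))
# at k = 50 (full rank here) the error is ~0
```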