# Day 24: The Spectral Theorem

Theorem (The Spectral Theorem, Part I). If $$A$$ is an $$n\times n$$ symmetric matrix, and $$A\neq 0$$, then $$A$$ has a nonzero eigenvalue.

Proposition. Let $$A$$ be a symmetric matrix. If $$v$$ and $$w$$ are eigenvectors of $$A$$ with eigenvalues $$\lambda$$ and $$\mu$$ respectively, and $$\lambda\neq \mu$$, then $$v\cdot w = 0$$.

Lemma. Assume $$A$$ is a symmetric matrix. If $$v$$ is a unit norm eigenvector of $$A$$ with eigenvalue $$\lambda\neq 0$$, then

$A_{1} = A - \lambda vv^{\top}$

is a symmetric matrix with $$\text{rank}(A_{1}) \leq \text{rank} (A)-1.$$

Proof. That $$A_{1}$$ is symmetric follows from the fact that $$(vv^{\top})^{\top} = (v^{\top})^{\top}v^{\top} = vv^{\top}$$ and the fact that taking the transpose is linear.

Note that $A_{1}v = Av-\lambda vv^{\top}v = \lambda v-\lambda v\|v\|^{2} = \lambda v-\lambda v= 0,$ since $$\|v\|=1$$. Therefore, $$v\in N(A_{1})$$.

Let $$\{e_{1},e_{2},\ldots,e_{k}\}$$ be an orthonormal basis for $$N(A)$$. Note that every nonzero element of $$N(A)$$ is an eigenvector of $$A$$ with eigenvalue $$0$$. Since $$\lambda\neq 0$$ we have $v\cdot e_{i} = 0\text{ for }i=1,2,\ldots,k.$ From this we deduce that each $$e_{1},\ldots,e_{k}$$ is in $$N(A_{1})$$. Moreover, $$\{e_{1},\ldots,e_{k},v\}$$ is a linearly independent set in $$N(A_{1})$$, so $$\dim N(A_{1})\geq k+1 = \dim N(A)+1$$. By rank-nullity, $$\text{rank}(A_{1}) = n - \dim N(A_{1}) \leq n - \dim N(A) - 1 = \text{rank}(A)-1$$. $$\Box$$
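As a quick numerical sanity check of the lemma, here is a sketch in NumPy (the code and library choice are mine, not the notes'), using the $$4\times 4$$ symmetric matrix from the example below:

```python
import numpy as np

A = np.array([[-1., -3.,  7.,  5.],
              [-3., -1.,  5.,  7.],
              [ 7.,  5., -1., -3.],
              [ 5.,  7., -3., -1.]])
lam = 4.0
v = np.array([-1., 1., -1., 1.]) / 2.0     # unit-norm eigenvector with eigenvalue 4

assert np.allclose(A @ v, lam * v)          # v really is an eigenvector
A1 = A - lam * np.outer(v, v)               # the deflated matrix A_1
assert np.allclose(A1, A1.T)                # A_1 is symmetric
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A1))  # rank drops by one: 3 2
```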


Example. Consider the symmetric matrix

$A =\left[\begin{array}{rrrr} -1 & -3 & 7 & 5\\ -3 & -1 & 5 & 7\\ 7 & 5 & -1 & -3\\ 5 & 7 & -3 & -1 \end{array}\right]$

$\text{rref}(A-4I) = \begin{bmatrix} 1 & 0 & 0 & 1\\ 0 & 1 & 0 & -1\\ 0 & 0 & 1 & 1\\ 0 & 0 & 0 & 0\end{bmatrix}$

From this we can see

$v = \frac{1}{2}\begin{bmatrix}-1\\ \phantom{-}1\\ -1\\ \phantom{-}1\end{bmatrix}$

is an eigenvector of $$A$$ with eigenvalue $$4.$$ In the lemma above we need an eigenvector with norm $$1,$$ which is why we normalize.

Example continued.

$A - 4 vv^{\top}= \left[\begin{array}{rrrr} -1 & -3 & 7 & 5\\ -3 & -1 & 5 & 7\\ 7 & 5 & -1 & -3\\ 5 & 7 & -3 & -1 \end{array}\right] - \left[\begin{array}{rrrr} 1 & -1 & 1 & -1\\ -1 & 1 & -1 & 1\\ 1 & -1 & 1 & -1\\ -1 & 1 & -1 & 1\end{array}\right]$

$= \left[\begin{array}{rrrr} -2 & -2 & 6 & 6\\ -2 & -2 & 6 & 6\\ 6 & 6 & -2 & -2\\ 6 & 6 & -2 & -2 \end{array}\right]$

$\text{rref}(A) = \begin{bmatrix} 1 & 0 & 0 & -1\\ 0 & 1 & 0 & 1\\ 0 & 0 & 1 & 1\\ 0 & 0 & 0 & 0\end{bmatrix}\text{ and }\text{rref}(A-4vv^{\top}) = \begin{bmatrix} 1 & 1 & 0 & 0\\ 0 & 0 & 1 & 1\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0\end{bmatrix}$

$$\text{rank}(A) = 3$$   and   $$\text{rank}(A-4vv^{\top})=2$$.

Example continued.

Note that $A_{1}:=A-4vv^{\top}= \left[\begin{array}{rrrr} -2 & -2 & 6 & 6\\ -2 & -2 & 6 & 6\\ 6 & 6 & -2 & -2\\ 6 & 6 & -2 & -2 \end{array}\right]$

is a nonzero symmetric matrix. Hence, by the Spectral Theorem (Part I), we know that it has a nonzero eigenvalue. In particular $$8$$ is an eigenvalue, since

$\text{rref}(A_{1} - 8I) = \begin{bmatrix} 1 & 0 & 0 & -1\\ 0 & 1 & 0 & -1\\ 0 & 0 & 1 & -1\\ 0 & 0 & 0 & 0\end{bmatrix}$

From this we see that

$w=\frac{1}{2}\begin{bmatrix} 1 & 1 & 1 & 1\end{bmatrix}^{\top}$

is a unit norm eigenvector of $$A_{1}$$ with eigenvalue $$8.$$

Example continued.

$A_{1} - 8 ww^{\top}= \left[\begin{array}{rrrr} -2 & -2 & 6 & 6\\ -2 & -2 & 6 & 6\\ 6 & 6 & -2 & -2\\ 6 & 6 & -2 & -2 \end{array}\right] - \begin{bmatrix} 2 & 2 & 2 & 2\\ 2 & 2 & 2 & 2\\ 2 & 2 & 2 & 2\\ 2 & 2 & 2 & 2 \end{bmatrix}$

$= \left[\begin{array}{rrrr} -4 & -4 & 4 & 4\\ -4 & -4 & 4 & 4\\ 4 & 4 & -4 & -4\\ 4 & 4 & -4 & -4 \end{array}\right]$

$A_{2}: = A - 4vv^{\top} - 8ww^{\top}= \left[\begin{array}{rrrr} -4 & -4 & 4 & 4\\ -4 & -4 & 4 & 4\\ 4 & 4 & -4 & -4\\ 4 & 4 & -4 & -4 \end{array}\right]$

$\text{rank}(A_{2}) = 1$

Example continued.

$A_{2}=\left[\begin{array}{rrrr} -4 & -4 & 4 & 4\\ -4 & -4 & 4 & 4\\ 4 & 4 & -4 & -4\\ 4 & 4 & -4 & -4 \end{array}\right]$

Finally, $$y=\frac{1}{2}\begin{bmatrix} 1 & 1 & -1 & -1\end{bmatrix}^{\top}$$ is a unit norm eigenvector of $$A_{2}$$ with eigenvalue $$-16$$. By the lemma, $$A_{2} - (-16)yy^{\top}$$ has rank zero, so it must be the zero matrix!

$0 = A_{2} - (-16)yy^{\top} = A_{1} - 8ww^{\top} - (-16)yy^{\top} = A - 4vv^{\top} - 8ww^{\top} - (-16)yy^{\top}$

and hence

$A = 4vv^{\top} + 8ww^{\top} +(-16)yy^{\top}$

Since $$v,w,$$ and $$y$$ are eigenvectors with eigenvalues $$4,8,$$ and $$-16$$ respectively, we see that $$\{v,w,y\}$$ is an orthonormal basis for $$C(A)$$.

Moreover, $$z=\frac{1}{2}[1\ -1\ -1\ \ 1]^{\top}$$ is a unit norm eigenvector of $$A$$ with eigenvalue $$0$$, so we may also write

$A = 4vv^{\top} + 8ww^{\top} +(-16)yy^{\top} + 0zz^{\top}.$
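The whole decomposition can be checked numerically. A minimal sketch in NumPy (the code is mine, not part of the notes):

```python
import numpy as np

# The symmetric matrix from the example and its unit-norm eigenvectors.
A = np.array([[-1., -3.,  7.,  5.],
              [-3., -1.,  5.,  7.],
              [ 7.,  5., -1., -3.],
              [ 5.,  7., -3., -1.]])
v = np.array([-1.,  1., -1.,  1.]) / 2   # eigenvalue   4
w = np.array([ 1.,  1.,  1.,  1.]) / 2   # eigenvalue   8
y = np.array([ 1.,  1., -1., -1.]) / 2   # eigenvalue -16
z = np.array([ 1., -1., -1.,  1.]) / 2   # eigenvalue   0

recon = 4*np.outer(v, v) + 8*np.outer(w, w) + (-16)*np.outer(y, y) + 0*np.outer(z, z)
assert np.allclose(recon, A)             # A equals the sum of scaled outer products

Q = np.column_stack([v, w, y, z])
assert np.allclose(Q.T @ Q, np.eye(4))   # {v, w, y, z} is orthonormal
```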

### Rank 1 matrices

Suppose $$A\in\mathbb{R}^{m\times n}$$ has rank $$1$$.

This means that $$C(A)$$ is a $$1$$-dimensional space, that is, there is a non-zero vector $$x\in\mathbb{R}^{m}$$ such that $$C(A) = \operatorname{span}\{x\}$$.

Every column of $$A$$ is a scalar multiple of $$x$$:

$A = \begin{bmatrix} \vert & \vert & & \vert\\ a_{1}x & a_{2}x & \cdots & a_{n}x\\ \vert & \vert & & \vert\end{bmatrix} = xy^{\top}$

where $$y^{\top}=[a_{1}\ \ a_{2}\ \ \cdots\ \ a_{n}]$$.

Proposition. If $$A$$ is an $$m\times n$$ rank $$1$$ matrix, then there are vectors $$x\in\mathbb{R}^{m}$$ and $$y\in\mathbb{R}^{n}$$ such that $$A = xy^{\top}$$.
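One way to compute such a factorization is to take $$x$$ to be any nonzero column and read off the multipliers $$a_{1},\ldots,a_{n}$$ as $$y$$. A sketch, where `rank1_factor` is a hypothetical helper name of my own:

```python
import numpy as np

def rank1_factor(A):
    """Return x, y with A = x y^T, assuming rank(A) == 1 (hypothetical helper)."""
    # Take the first nonzero column as x.
    j = next(j for j in range(A.shape[1]) if np.any(A[:, j]))
    x = A[:, j].astype(float)
    # Every column is a multiple of x; divide a nonzero row entry out to get y.
    i = int(np.argmax(np.abs(x)))
    y = A[i, :] / x[i]
    return x, y

A = np.array([[1., 1., 1.],
              [2., 2., 2.]])
x, y = rank1_factor(A)
assert np.allclose(np.outer(x, y), A)    # A = x y^T
```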


It's easy to identify matrices that are rank $$1$$:

$\left[\begin{array}{rrr} 1 & 1 & 1\\ 2 & 2 & 2\end{array}\right]$

$\left[\begin{array}{rrr} 1 & 2 & 3\\ 4 & 5 & 6\end{array}\right]$

$\left[\begin{array}{rrr} 1 & 2 & 3\\ 2 & 4 & 6\end{array}\right]$

$\left[\begin{array}{rrr} 1 & 2 & 0\\ 2 & 4 & 0\end{array}\right]$

$\left[\begin{array}{rrr} 0 & 1 & 2\\ 0 & 2 & 4\\ 0 & 4 & 6\end{array}\right]$

$\left[\begin{array}{rrr} 1 & 2 & -1\\ 1 & 2 & -1\\ 0 & 0 & 0\end{array}\right]$

$\left[\begin{array}{rrr} 1 & 2 & 3\\ 2 & 4 & 6\\ 4 & 8 & 16\end{array}\right]$

Which of the following are rank $$1$$? For each rank $$1$$ matrix, find vectors $$x$$ and $$y$$ so that the matrix is equal to $$xy^{\top}$$.

$\left[\begin{array}{rrr} 0 & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 0\end{array}\right]$


$\left[\begin{array}{rrr} 1 & 1 & 1\\ 2 & 2 & 2\end{array}\right] = \begin{bmatrix}1\\2\end{bmatrix}\begin{bmatrix}1 & 1 & 1\end{bmatrix}$

$\left[\begin{array}{rrr} 1 & 2 & 3\\ 2 & 4 & 6\end{array}\right] = \begin{bmatrix}1\\ 2\end{bmatrix}\begin{bmatrix}1 & 2 & 3\end{bmatrix}\quad\text{or} \quad \begin{bmatrix}2\\ 4\end{bmatrix}\begin{bmatrix}1/2 & 1 & 3/2\end{bmatrix}$

$\left[\begin{array}{rrr} 1 & 2 & 0\\ 2 & 4 & 0\end{array}\right] = \begin{bmatrix}1\\ 2\end{bmatrix}\begin{bmatrix}1 & 2 & 0\end{bmatrix}$

$\left[\begin{array}{rrr} 1 & 2 & -1\\ 1 & 2 & -1\\ 0 & 0 & 0\end{array}\right] = \begin{bmatrix}1\\ 1\\ 0\end{bmatrix}\begin{bmatrix}1 & 2 & -1\end{bmatrix}$

Even though it's not rank $$1$$, the zero matrix can still be written as an outer product: $$\left[\begin{array}{rrr} 0 & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 0\end{array}\right] = \begin{bmatrix}0\\ 0\\ 0\end{bmatrix}\begin{bmatrix}0 & 0 & 0\end{bmatrix}$$

Suppose $$V\subset\mathbb{R}^{n}$$ is a $$1$$-dimensional subspace.  If $$x\in\mathbb{R}^{n}$$ is a unit vector, then $$\{x\}$$ is an orthonormal basis for $$V=\operatorname{span}\{x\}$$, and $$xx^{\top}$$ is the orthogonal projection onto $$V$$.

Conversely, if $$P$$ is an orthogonal projection and $$\operatorname{rank}(P)=1$$, then $$P$$ is the orthogonal projection onto its column space $$C(P)$$, which is a $$1$$-dimensional subspace. Hence, if we let $$x$$ be a unit vector in $$C(P)$$, then $$xx^{\top}$$ is the orthogonal projection onto $$C(P)$$, that is, $$P=xx^{\top}$$.

### Rank 1 projections

Theorem. A matrix $$P\in\mathbb{R}^{n\times n}$$ is a rank 1 projection if and only if there is a unit vector $$x\in\mathbb{R}^{n}$$ such that $$P=xx^{\top}$$.
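A quick numerical check of the defining properties (symmetric, idempotent, rank $$1$$) for a projection of the form $$xx^{\top}$$, sketched in NumPy:

```python
import numpy as np

x = np.array([1., 1., -1., -1.]) / 2.0    # a unit vector (any unit vector works)
P = np.outer(x, x)                         # candidate rank-1 projection onto span{x}

assert np.allclose(P, P.T)                 # symmetric
assert np.allclose(P @ P, P)               # idempotent
assert np.linalg.matrix_rank(P) == 1       # rank 1
assert np.allclose(P @ x, x)               # fixes every vector in span{x}
```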

Recall that we showed:

$\left[\begin{array}{rrrr} -1 & -3 & 7 & 5\\ -3 & -1 & 5 & 7\\ 7 & 5 & -1 & -3\\ 5 & 7 & -3 & -1 \end{array}\right] = 4vv^{\top} + 8ww^{\top} +(-16)yy^{\top}+ 0zz^\top$

where $$\{v,w,y,z\}$$ is an orthonormal basis for $$\mathbb{R}^{4}$$.

From this we see that our symmetric matrix $$A$$ is a linear combination of rank $$1$$ projections. Moreover, the ranges of these rank $$1$$ projections are orthogonal.

Theorem (The Spectral Theorem, Part II). Assume $$A$$ is an $$n\times n$$ symmetric matrix which is not the zero matrix. There is an orthonormal basis $$\{v_{1},v_{2},\ldots,v_{k}\}$$ for $$C(A)$$ and scalars $$\lambda_{1},\lambda_{2},\ldots,\lambda_{k}$$ such that

$A = \lambda_{1}v_{1}v_{1}^{\top} + \lambda_{2}v_{2}v_{2}^{\top} + \cdots + \lambda_{k}v_{k}v_{k}^{\top}.$

Theorem (The Spectral Theorem, Outer Product form). Assume $$A$$ is an $$n\times n$$ symmetric matrix. There is an orthonormal basis $$\{v_{1},v_{2},\ldots,v_{n}\}$$ for $$\mathbb{R}^{n}$$ and scalars $$\lambda_{1},\lambda_{2},\ldots,\lambda_{n}$$ such that

$A = \lambda_{1}v_{1}v_{1}^{\top} + \lambda_{2}v_{2}v_{2}^{\top} + \cdots + \lambda_{n}v_{n}v_{n}^{\top}.$

Proof. Take the basis $$\{v_{1},\ldots,v_{k}\}$$ for $$C(A)$$ from Part II, let $$\{v_{k+1},\ldots,v_{n}\}$$ be an orthonormal basis for $$N(A)$$, and take $$\lambda_{k+1} = \cdots = \lambda_{n} =0.$$ Since $$A$$ is symmetric, $$N(A) = C(A)^{\perp}$$, so together these vectors form an orthonormal basis for $$\mathbb{R}^{n}$$. $$\Box$$

Example. Consider the symmetric matrix

$A = \left[\begin{array}{rrr} 3 & 1 & -1\\ 1 & 3 & -1\\ -1 & -1 & 5\end{array}\right]$

Using the power method we find that $$6$$ is an eigenvalue, so we compute:

$\text{rref}(A-6I) = \left[\begin{array}{rrr} 1 & 0 & \frac{1}{2}\\[0.3ex] 0 & 1 & \frac{1}{2}\\[0.3ex] 0 & 0 & 0\end{array}\right]$

Hence, $$v_{1} = \frac{1}{\sqrt{6}}[-1\ -1\ \ \ 2]^{\top}$$ is a unit norm eigenvector of $$A$$ with eigenvalue $$6$$.

Set  $A_{1} = A - 6v_{1}v_{1}^{\top} = \left[\begin{array}{rrr}\!\! 3 & 1 & -1\\\!\! 1 & 3 & -1\\\!\! -1 & -1 & 5\end{array}\right] - \left[\begin{array}{rrr}\!\!1 & 1 & -2\\\!\! 1 & 1 & -2\\\!\! -2 & -2 & 4\end{array}\right] = \left[\begin{array}{rrr}2 & 0 & 1\\ 0 & 2 & 1\\ 1 & 1 & 1\end{array}\right]$
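The power method used above to locate the eigenvalue $$6$$ can be sketched as follows (the starting vector and iteration count are arbitrary choices of mine; the method converges to the eigenvalue of largest magnitude):

```python
import numpy as np

A = np.array([[ 3., 1., -1.],
              [ 1., 3., -1.],
              [-1., -1., 5.]])

x = np.array([1., 0., 0.])       # arbitrary starting vector, not orthogonal to v_1
for _ in range(200):
    x = A @ x                    # multiply by A ...
    x = x / np.linalg.norm(x)    # ... and renormalize each step
lam = x @ A @ x                  # Rayleigh quotient estimate of the eigenvalue
print(round(lam, 6))             # → 6.0
```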

Example. We continue with

$A_{1} = \left[\begin{array}{rrr}2 & 0 & 1\\ 0 & 2 & 1\\ 1 & 1 & 1\end{array}\right]$

Using the power method we find that $$3$$ is an eigenvalue, so we compute:

$\text{rref}(A_{1}-3I) = \left[\begin{array}{rrr} 1 & 0 & -1\\ 0 & 1 & -1\\0 & 0 & 0\end{array}\right]$

Hence, $$v_{2} = \frac{1}{\sqrt{3}}[1\ \ 1\ \ 1]^{\top}$$ is a unit norm eigenvector of $$A_{1}$$ with eigenvalue $$3$$.

Set  $A_{2} = A_{1} - 3v_{2}v_{2}^{\top} = \left[\begin{array}{rrr}2 & 0 & 1\\ 0 & 2 & 1\\ 1 & 1 & 1\end{array}\right] - \left[\begin{array}{rrr} 1& 1 & 1\\ 1 & 1 & 1\\ 1 & 1 & 1\end{array}\right] = \left[\begin{array}{rrr}\!\! 1 & -1 &\phantom{-}0\\\!\! -1 & 1 & 0\\\!\! 0 & 0 & 0\end{array}\right]$

Example. We continue with

$A_{2} = \left[\begin{array}{rrr}\!\! 1 & -1 &\phantom{-}0\\\!\! -1 & 1 & 0\\\!\! 0 & 0 & 0\end{array}\right]$

Using the power method we find that $$2$$ is an eigenvalue, so we compute:

$\text{rref}(A_{2}-2I) = \left[\begin{array}{rrr} 1 & 1 & 0\\ 0 & 0 & 1\\0 & 0 & 0\end{array}\right]$

Hence, $$v_{3} = \frac{1}{\sqrt{2}}[-1\ \ 1\ \ 0]^{\top}$$ is a unit norm eigenvector of $$A_{2}$$ with eigenvalue $$2$$.

Set  $A_{3} = A_{2} - 2v_{3}v_{3}^{\top} = \left[\begin{array}{rrr}\!\! 1 & -1 &\phantom{-}0\\\!\! -1 & 1 & 0\\\!\! 0 & 0 & 0\end{array}\right] - \left[\begin{array}{rrr}\!\! 1 & -1 &\phantom{-}0\\\!\! -1 & 1 & 0\\\!\! 0 & 0 & 0\end{array}\right]= \left[\begin{array}{rrr}0 & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 0\end{array}\right]$

Example. Thus, we have

$A = \left[\begin{array}{rrr} 3 & 1 & -1\\ 1 & 3 & -1\\ -1 & -1 & 5\end{array}\right] = 6v_{1}v_{1}^{\top} + 3v_{2}v_{2}^{\top} + 2v_{3}v_{3}^{\top}$

$$=6\left(\dfrac{1}{6}\left[\begin{array}{rrr}\!\!1 &\!\! 1 &\!\! -2\\\!\! 1 &\!\! 1 &\!\! -2\\\!\! -2 &\!\! -2 &\!\! 4\end{array}\right] \right) + 3\left(\dfrac{1}{3}\left[\begin{array}{rrr} 1& 1 & 1\\ 1 & 1 & 1\\ 1 & 1 & 1\end{array}\right]\right) + 2\left(\dfrac{1}{2}\left[\begin{array}{rrr}\!\! 1 &\!\! -1 &\!\! \phantom{-}0\\\!\! -1 &\!\! 1 &\!\! 0\\\!\! 0 &\!\! 0 &\!\! 0\end{array}\right]\right)$$

where $$\{v_{1},v_{2},v_{3}\}$$ is an orthonormal basis for $$\mathbb{R}^{3}$$.
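As before, the three-term decomposition can be verified numerically; a sketch in NumPy:

```python
import numpy as np

A = np.array([[ 3., 1., -1.],
              [ 1., 3., -1.],
              [-1., -1., 5.]])
v1 = np.array([-1., -1., 2.]) / np.sqrt(6)   # eigenvalue 6
v2 = np.array([ 1.,  1., 1.]) / np.sqrt(3)   # eigenvalue 3
v3 = np.array([-1.,  1., 0.]) / np.sqrt(2)   # eigenvalue 2

recon = 6*np.outer(v1, v1) + 3*np.outer(v2, v2) + 2*np.outer(v3, v3)
assert np.allclose(recon, A)                  # the outer products sum back to A
```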

Define the orthogonal matrix

$Q = \begin{bmatrix} \vert & \vert & \vert\\ v_{1} & v_{2} & v_{3}\\ \vert & \vert & \vert\end{bmatrix} = \left[\begin{array}{rrr} \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{3}} &\!\! -\frac{1}{\sqrt{2}}\\ \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{3}} &\!\! \frac{1}{\sqrt{2}}\\ -\frac{2}{\sqrt{6}} & \frac{1}{\sqrt{3}} &\!\! 0\end{array}\right]$

Example continued.

$Q\begin{bmatrix} 6 & 0 & 0\\ 0 & 3 & 0\\ 0 & 0 & 2\end{bmatrix}Q^{-1} = \begin{bmatrix} \vert & \vert & \vert\\ v_{1} & v_{2} & v_{3}\\ \vert & \vert & \vert\end{bmatrix} \begin{bmatrix} 6 & 0 & 0\\ 0 & 3 & 0\\ 0 & 0 & 2\end{bmatrix} \begin{bmatrix} - & v_{1}^{\top} & -\\ - & v_{2}^{\top} & -\\ - & v_{3}^{\top} & - \end{bmatrix}$

$$=\begin{bmatrix} \vert & \vert & \vert\\ v_{1} & v_{2} & v_{3}\\ \vert & \vert & \vert\end{bmatrix} \begin{bmatrix} - & 6v_{1}^{\top} & -\\ - & 3v_{2}^{\top} & -\\ - & 2v_{3}^{\top} & - \end{bmatrix}$$

$= 6v_{1}v_{1}^{\top} + 3v_{2}v_{2}^{\top} + 2v_{3}v_{3}^{\top} = A$

Definition. A matrix $$A$$ is called orthogonally diagonalizable if there is an orthogonal matrix $$Q$$ and a diagonal matrix $$\Lambda$$ so that

$A = Q\Lambda Q^{-1}.$

Theorem (The Spectral Theorem, Matrix form). If $$A$$ is a symmetric matrix, then $$A$$ is orthogonally diagonalizable.

Proof. By the Spectral Theorem (Outer Product form) we have an orthonormal basis $$\{v_{1},\ldots,v_{n}\}$$ and scalars $$\lambda_{1},\ldots,\lambda_{n}$$ such that

$A = \lambda_{1}v_{1}v_{1}^{\top} + \lambda_{2}v_{2}v_{2}^{\top} + \cdots + \lambda_{n}v_{n}v_{n}^{\top}.$

Then we define the orthogonal matrix $$Q$$ and the diagonal matrix $$\Lambda$$:

$Q = \begin{bmatrix} | &| & & |\\ v_{1} & v_{2} & \cdots & v_{n}\\ | &| & & |\end{bmatrix}\text{ and }\Lambda = \begin{bmatrix}\lambda_{1} & 0 & \cdots & 0\\ 0 & \lambda_{2} & & \vdots\\ \vdots & & \ddots & \vdots\\ 0 & \cdots & \cdots & \lambda_{n}\end{bmatrix}$

Then we see

$Q\Lambda Q^{\top} = \begin{bmatrix} | &| & & |\\ v_{1} & v_{2} & \cdots & v_{n}\\ | &| & & |\end{bmatrix} \begin{bmatrix} - & \lambda_{1}v_{1}^{\top} & - \\ - & \lambda_{2}v_{2}^{\top} & - \\ & \vdots & \\ - & \lambda_{n}v_{n}^{\top} & -\end{bmatrix}$

$=\lambda_{1}v_{1}v_{1}^{\top} + \lambda_{2}v_{2}v_{2}^{\top} + \cdots + \lambda_{n}v_{n}v_{n}^{\top}=A.\ \Box$
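In practice the factorization $$A = Q\Lambda Q^{\top}$$ is computed by a library eigensolver. A sketch using NumPy's symmetric eigensolver `np.linalg.eigh` (which returns eigenvalues in ascending order):

```python
import numpy as np

A = np.array([[ 3., 1., -1.],
              [ 1., 3., -1.],
              [-1., -1., 5.]])

evals, Q = np.linalg.eigh(A)                      # columns of Q are orthonormal eigenvectors
assert np.allclose(Q.T @ Q, np.eye(3))            # Q is orthogonal
assert np.allclose(Q @ np.diag(evals) @ Q.T, A)   # A = Q Λ Q^T
print(evals)                                      # eigenvalues 2, 3, 6 (ascending)
```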

Example. Find an orthogonal matrix $$Q$$ and a diagonal matrix $$\Lambda$$ such that

$Q\Lambda Q^{-1} = \begin{bmatrix} 2 & 2 & 2\\ 2 & 2 & 2\\ 2 & 2 & 2\end{bmatrix}=:A$

We can see that $$v_{1} = \frac{1}{\sqrt{3}}[1\ 1\ 1]^{\top}$$ is an eigenvector of $$A$$ with eigenvalue $$6$$. Since $$A-6v_{1}v_{1}^{\top}$$ is the zero matrix, the only other eigenvalue of $$A$$ is zero.

$\text{rref}(A) = \begin{bmatrix} 1 & 1 & 1\\ 0 & 0 & 0\\ 0 & 0 & 0\end{bmatrix}$

and hence $$\left\{\begin{bmatrix} -1\\ \phantom{-}1\\ \phantom{-}0\end{bmatrix},\begin{bmatrix} -1\\ \phantom{-}0\\ \phantom{-}1\end{bmatrix}\right\}$$ is a basis for the eigenspace $$N(A)$$.

Using Gram-Schmidt, we find that $$\left\{\frac{1}{\sqrt{2}}\begin{bmatrix} -1\\ \phantom{-}1\\ \phantom{-}0\end{bmatrix},\frac{1}{\sqrt{6}}\begin{bmatrix} -1\\ -1\\ \phantom{-}2\end{bmatrix}\right\}$$ is an orthonormal basis for $$N(A)$$.
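The Gram-Schmidt step above can be sketched numerically (the code is mine; it reproduces the orthonormal basis just stated):

```python
import numpy as np

# The basis of N(A) found from rref(A).
b1 = np.array([-1., 1., 0.])
b2 = np.array([-1., 0., 1.])

u1 = b1 / np.linalg.norm(b1)          # normalize the first vector
w  = b2 - (b2 @ u1) * u1              # subtract the component of b2 along u1
u2 = w / np.linalg.norm(w)            # normalize the remainder

assert np.isclose(u1 @ u2, 0.0)       # the pair is orthonormal
assert np.allclose(u2, np.array([-1., -1., 2.]) / np.sqrt(6))  # matches the basis above
```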

Example. Therefore

$A=\begin{bmatrix} 2 & 2 & 2\\ 2 & 2 & 2\\ 2 & 2 & 2\end{bmatrix} = \left[\begin{array}{rrr} \frac{1}{\sqrt{3}} &\!\! -\frac{1}{\sqrt{2}} &\!\! -\frac{1}{\sqrt{6}}\\\!\! \frac{1}{\sqrt{3}} &\!\! \frac{1}{\sqrt{2}} &\!\! -\frac{1}{\sqrt{6}}\\\!\! \frac{1}{\sqrt{3}} &\!\! 0 &\!\! \frac{2}{\sqrt{6}}\end{array}\right]\begin{bmatrix} 6 & 0 & 0\\ 0 & 0 & 0\\ 0 & 0 & 0\end{bmatrix}\left[\begin{array}{rrr} \frac{1}{\sqrt{3}} &\!\! -\frac{1}{\sqrt{2}} &\!\! -\frac{1}{\sqrt{6}}\\\!\! \frac{1}{\sqrt{3}} &\!\! \frac{1}{\sqrt{2}} &\!\! -\frac{1}{\sqrt{6}}\\\!\! \frac{1}{\sqrt{3}} &\!\! 0 &\!\! \frac{2}{\sqrt{6}}\end{array}\right]^{\top}$

We can also write out the outer product decomposition:

$$A= 6\begin{bmatrix}1/\sqrt{3}\\ 1/\sqrt{3}\\ 1/\sqrt{3}\end{bmatrix}\begin{bmatrix} 1/\sqrt{3} & 1/\sqrt{3} & 1/\sqrt{3}\end{bmatrix} + 0\begin{bmatrix}-1/\sqrt{2}\\ 1/\sqrt{2}\\ 0\end{bmatrix}\begin{bmatrix} -1/\sqrt{2} & 1/\sqrt{2} & 0\end{bmatrix}$$

$$+ 0\begin{bmatrix} -1/\sqrt{6}\\ -1/\sqrt{6}\\ 2/\sqrt{6}\end{bmatrix}\begin{bmatrix} -1/\sqrt{6} & -1/\sqrt{6} & 2/\sqrt{6}\end{bmatrix}$$

Theorem (The Spectral Theorem). If $$A\in\mathbb{R}^{n\times n}$$ is a symmetric matrix, then there is an orthonormal basis $$\{v_{1},\ldots,v_{n}\}$$ for $$\mathbb{R}^{n}$$ consisting of eigenvectors of $$A$$, with associated eigenvalues $$\lambda_{1},\lambda_{2},\ldots,\lambda_{n}$$. Moreover, we have the following:

1. $A = \sum_{i=1}^{n}\lambda_{i}v_{i}v_{i}^{\top}.$
2. $A = Q\Lambda Q^{\top} = Q\Lambda Q^{-1},$ where $Q=\begin{bmatrix} \vert & \vert & & \vert\\ v_{1} & v_{2} & \cdots & v_{n}\\ \vert & \vert & & \vert\end{bmatrix}\quad\text{and}\quad \Lambda = \begin{bmatrix} \lambda_{1} & 0 & \cdots & 0\\ 0 & \lambda_{2} & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \cdots & \lambda_{n}\end{bmatrix}.$

By John Jasper
