CS6015: Linear Algebra and Random Processes

Lecture 16: The Eigenstory begins: computing eigenvalues and eigenvectors.

Learning Objectives

What is the eigenstory? (the outline of it)

(for today's lecture)

What are eigenvalues and eigenvectors?

How do you compute them?

What are eigenvalues/eigenvectors of some special matrices?

The Eigenstory

(the outline: the questions we will answer over the coming lectures)

How do we compute eigenvalues/eigenvectors?
(the characteristic equation \(\det(A - \lambda I) = 0\) and \((A - \lambda I)\mathbf{x} = \mathbf{0}\))

What are the possible values?
(real or imaginary, distinct or repeating)

What are the eigenvalues of some special matrices?
(identity, projection, reflection, rotation, Markov, singular, orthogonal, rank one, symmetric, permutation)

What is the relation between the eigenvalues of related matrices?
(\(A^\top\), \(A^{-1}\), \(A^2\), \(A^\top A\), \(AB\), \(A+B\), \(A + kI\), \(U\), \(R\))

What do eigenvalues reveal about a matrix?
(trace, determinant, invertibility, rank, nullspace, columnspace, positive pivots (positive semidefinite matrices))

When do we get a basis of eigenvectors?
(distinct values \(\implies\) independent eigenvectors; symmetric \(\implies\) orthogonal eigenvectors)

What are some applications in which eigenvalues play an important role?
(diagonalisation, powers of A, steady state (Markov matrices), PCA, optimisation)

Intuition

The span of a vector \(\mathbf{u}\) is a line: \(c\mathbf{u}~\forall c \in \mathbb{R}\)

What happens when a matrix hits a vector?

A\mathbf{x}:\quad \begin{bmatrix} 1&2\\ 2&1\\ \end{bmatrix}\begin{bmatrix} 1\\ 2\\ \end{bmatrix} =\begin{bmatrix} 5\\ 4 \end{bmatrix}

Most vectors get knocked off their span

(i.e., most vectors change their direction)

But some special vectors stay on their span

\begin{bmatrix} 1&2\\ 2&1\\ \end{bmatrix}\begin{bmatrix} 1\\ 1\\ \end{bmatrix} =\begin{bmatrix} 3\\ 3 \end{bmatrix} =3\begin{bmatrix} 1\\ 1 \end{bmatrix}

(i.e., they only get scaled or squished)

These are called eigenvectors, and the scaling factor is called the eigenvalue

Definition

The vector \(\mathbf{x} \neq \mathbf{0}\) is an eigenvector of \(A\) if

A\mathbf{x} = \lambda\mathbf{x}

\(\lambda\) is the corresponding eigenvalue

Observation 1: If \(\mathbf{x}\) is an eigenvector, then so is \(c\mathbf{x}\) (i.e., any vector in the span of \(\mathbf{x}\)), and the corresponding eigenvalue remains the same

\begin{bmatrix} 1&2\\ 2&1\\ \end{bmatrix}\begin{bmatrix} 2\\ 2\\ \end{bmatrix} =\begin{bmatrix} 6\\ 6 \end{bmatrix} =3\begin{bmatrix} 2\\ 2 \end{bmatrix}

The eigenvectors corresponding to an eigenvalue (together with \(\mathbf{0}\)) form a subspace

Observation 2: If \(\mathbf{x}\) is an eigenvector with \(\lambda \neq 0\), then \(\mathbf{x} = \frac{1}{\lambda}A\mathbf{x}\) lies in the column space of \(A\)

(this is useful and we will return to it later)

Observation 3: Eigenvectors only make sense for square matrices

If \(A\) is \(m \times n\) with \(m \neq n\), then \(\mathbf{x}\) is \(n \times 1\) while \(A\mathbf{x}\) is \(m \times 1\), so \(A\mathbf{x} = \lambda\mathbf{x}\) can never hold

(these two vectors can never be the same, as they live in two different spaces)

How do we compute the eigenvalues?

A\mathbf{x} = \lambda\mathbf{x}
\implies A\mathbf{x} = \lambda I\mathbf{x}
(we now have a matrix on both sides)
\implies A\mathbf{x} - \lambda I\mathbf{x} = \mathbf{0}
\implies (A - \lambda I)\mathbf{x} = \mathbf{0}

(trivial solution: \(\mathbf{x} = \mathbf{0}\) -- not very interesting; we are looking for a non-zero eigenvector)

What does \(\mathbf{x}\) being non-zero imply?

\(\implies\) null space of \(A - \lambda I\) is non-zero

\(\implies\) columns of \(A - \lambda I\) are not independent

\(\implies\) \(A - \lambda I\) is not invertible (singular)

\(\implies\) \(\det(A - \lambda I) = 0 \)

For our running example:

\begin{bmatrix} 1&2\\ 2&1\\ \end{bmatrix}\mathbf{x} = \lambda\begin{bmatrix} 1&0\\ 0&1\\ \end{bmatrix}\mathbf{x}
\implies \begin{bmatrix} 1-\lambda&2\\ 2&1-\lambda\\ \end{bmatrix}\mathbf{x}= \mathbf{0}

How do we compute the eigenvalues?

\(det(A - \lambda I) = 0 \) 

A=\begin{bmatrix} 1&2\\ 2&1\\ \end{bmatrix}
A-\lambda I=\begin{bmatrix} 1-\lambda&2\\ 2&1-\lambda\\ \end{bmatrix}
det(A-\lambda I)= (1-\lambda)\cdot(1-\lambda) - 2\cdot2 = 0
\lambda^2 - 2\lambda - 3 = 0
(\lambda - 3)(\lambda + 1) = 0
\lambda = 3, \lambda = -1
(this is called the characteristic equation)
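As a quick sanity check on the computation above, here is a small sketch (plain Python, not part of the lecture) that solves the \(2 \times 2\) characteristic equation \(\lambda^2 - (a+d)\lambda + (ad - bc) = 0\) with the quadratic formula:

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Roots of det(A - lambda*I) = lambda^2 - (a+d)*lambda + (a*d - b*c)."""
    trace = a + d                    # coefficient of the linear term (negated)
    det = a * d - b * c              # constant term
    disc = trace * trace - 4 * det   # discriminant of the quadratic
    if disc < 0:
        raise ValueError("eigenvalues are imaginary (discriminant < 0)")
    r = math.sqrt(disc)
    return (trace + r) / 2, (trace - r) / 2

# A = [[1, 2], [2, 1]]: lambda^2 - 2*lambda - 3 = 0
print(eigenvalues_2x2(1, 2, 2, 1))   # (3.0, -1.0)
```

The function name and interface are my own choices for illustration; the formula is just the characteristic equation from the slide.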

How do we compute the eigenvectors?

Solve \((A-\lambda I)\mathbf{x} = \mathbf{0}\) for each eigenvalue found from \(\det(A - \lambda I) = 0 \)

(this is \(A\mathbf{x} = \mathbf{0}\) -- we know how to solve this)

A=\begin{bmatrix} 1&2\\ 2&1\\ \end{bmatrix},\quad \lambda = 3, \lambda = -1

\lambda = 3:\quad \begin{bmatrix} 1-3&2\\ 2&1-3\\ \end{bmatrix}\mathbf{x} = \begin{bmatrix} -2&2\\ 2&-2\\ \end{bmatrix}\mathbf{x} = \mathbf{0} \implies \mathbf{x}=\begin{bmatrix} 1\\ 1\\ \end{bmatrix}

\lambda = -1:\quad \begin{bmatrix} 1-(-1)&2\\ 2&1-(-1)\\ \end{bmatrix}\mathbf{x} = \begin{bmatrix} 2&2\\ 2&2\\ \end{bmatrix}\mathbf{x} = \mathbf{0} \implies \mathbf{x}=\begin{bmatrix} 1\\ -1\\ \end{bmatrix}
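Each eigenpair can be verified directly by checking \(A\mathbf{x} = \lambda\mathbf{x}\); a minimal numpy sketch:

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 1.0]])

# lambda = 3 pairs with x = [1, 1]
x1 = np.array([1.0, 1.0])
assert np.allclose(A @ x1, 3 * x1)

# lambda = -1 pairs with x = [1, -1]
x2 = np.array([1.0, -1.0])
assert np.allclose(A @ x2, -1 * x2)

print("A x = lambda x holds for both eigenpairs")
```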

The \(n \times n\) case

\(\det(A - \lambda I) = 0 \)

A - \lambda I = \begin{bmatrix} a_{11}-\lambda&a_{12}&a_{13}&\cdots&a_{1n}\\ a_{21}&a_{22}-\lambda&a_{23}&\cdots&a_{2n}\\ a_{31}&a_{32}&a_{33}-\lambda&\cdots&a_{3n}\\ \vdots&\vdots&\vdots&\ddots&\vdots\\ a_{n1}&a_{n2}&a_{n3}&\cdots&a_{nn}-\lambda\\ \end{bmatrix}

Characteristic Equation: \(\lambda^n + \cdots = 0\) (a degree-\(n\) polynomial in \(\lambda\))

Observation: For an \(n \times n \) matrix we expect \(n\) roots: \(n\) eigenvalues and \(n\) eigenvectors

(but sometimes things could go wrong)

What could go wrong?

The good case:

A=\begin{bmatrix} 1&2\\ 2&1\\ \end{bmatrix},\quad \begin{vmatrix} 1-\lambda&2\\ 2&1-\lambda\\ \end{vmatrix} = \lambda^2 - 2\lambda - 3 = 0,\quad \lambda = 3, \lambda = -1

\(n\) real, distinct eigenvalues \(\implies n\) independent eigenvectors

(I see a basis there - coming soon)

The not-so-good case (imaginary):

A=\begin{bmatrix} 0&-1\\ 1&0\\ \end{bmatrix},\quad \begin{vmatrix} 0-\lambda&-1\\ 1&0-\lambda\\ \end{vmatrix} = \lambda^2 + 1 = 0,\quad \lambda = i, \lambda = -i

imaginary eigenvalues, imaginary eigenvectors

(in many real-world applications, imaginary values are not good)

The not-so-good case (repeating):

A=\begin{bmatrix} 3&1\\ 0&3\\ \end{bmatrix},\quad \begin{vmatrix} 3-\lambda&1\\ 0&3-\lambda\\ \end{vmatrix} = (3 - \lambda)(3 - \lambda) = 0,\quad \lambda = 3, \lambda = 3

repeating eigenvalues, possibly \(< n\) independent eigenvectors

(I see an incomplete basis there - coming soon)
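The three cases can be reproduced numerically; a sketch using `numpy.linalg.eig` on the three matrices from the slide:

```python
import numpy as np

# The good case: a symmetric matrix with real, distinct eigenvalues.
A = np.array([[1.0, 2.0], [2.0, 1.0]])
good_vals, good_vecs = np.linalg.eig(A)   # eigenvalues 3 and -1

# The imaginary case: a 90-degree rotation.
R = np.array([[0.0, -1.0], [1.0, 0.0]])
rot_vals, rot_vecs = np.linalg.eig(R)     # eigenvalues i and -i

# The repeating case: lambda = 3 twice, but only one
# independent eigenvector direction.
J = np.array([[3.0, 1.0], [0.0, 3.0]])
rep_vals, rep_vecs = np.linalg.eig(J)     # eigenvalues 3 and 3

# The returned eigenvector columns of J are (numerically) parallel,
# so together they span only a 1-dimensional space.
print(np.linalg.matrix_rank(rep_vecs, tol=1e-8))
```

Note that `eig` happily returns complex eigenvalues for the rotation matrix, and for the repeating case it returns two eigenvector columns that point along the same line.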

The Eigenstory

(where are we? The characteristic equation \(\det(A - \lambda I) = 0\) is done; next: the eigenvalues of some special matrices.)

EVs of some special matrices: Identity

I=\begin{bmatrix} 1&0&0\\ 0&1&0\\ 0&0&1\\ \end{bmatrix},\quad I-\lambda I=\begin{bmatrix} 1-\lambda&0&0\\ 0&1-\lambda&0\\ 0&0&1-\lambda\\ \end{bmatrix}

\det(I-\lambda I) = (1-\lambda)^3= 0 \implies \lambda_1=1,\lambda_2=1,\lambda_3=1

repeating eigenvalues, but no shortage of eigenvectors

(every \(n\)-dimensional vector is an eigenvector of \(I\))

EVs of some special matrices: Projection

\(P = A(A^\top A)^{-1}A^\top\) projects vectors onto the column space of \(A\)

If \(\mathbf{x} \in \mathbb{R}^n\) there are three possibilities:

\(\mathbf{x}\) in the column space of \(A\) (angle = 0): \(P\mathbf{x} = 1\cdot\mathbf{x}\)

\(\mathbf{x}\) orthogonal to the column space of \(A\) (angle = 90): \(P\mathbf{x} = 0\cdot\mathbf{x}\)

\(\mathbf{x}\) at some other angle to the column space of \(A\) (angle \(\neq\) 0, 90): \(P\mathbf{x} \neq c\cdot\mathbf{x}\) - the direction will change

Possible eigenvalues: 0, 1 (nothing else)

Corresponding eigenvectors: nullspace of \(A^\top\) (for 0), column space of \(A\) (for 1)

EVs of some special matrices: Rotation

A=\begin{bmatrix} \cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\\ \end{bmatrix};\quad \theta=90^{\circ}:\ A=\begin{bmatrix} 0&-1\\ 1&0\\ \end{bmatrix}

The purpose of a rotation matrix is to rotate (move) vectors to a different direction!

What is the question that we are asking? (Which vectors will not move?)

Will we get a good answer? (Obviously not!)

\lambda_1=i,\ u_1=\begin{bmatrix} i\\1 \end{bmatrix};\quad \lambda_2=-i,\ u_2=\begin{bmatrix} -i\\1 \end{bmatrix}

EVs of some special matrices: Singular

If \(A\) is singular (non-invertible), we know that \(A\mathbf{x}=\mathbf{0}\) has a non-zero solution

A\mathbf{x}=\mathbf{0} \implies A\mathbf{x}=0\cdot\mathbf{x}

Hence, 0 is always an eigenvalue of any singular (non-invertible) matrix

What are the corresponding eigenvectors? (the non-zero special solutions of \(A\mathbf{x} = \mathbf{0}\))

A=\begin{bmatrix} 1&2&3\\ 1&2&3\\ 1&2&3\\ \end{bmatrix}:\quad \lambda_1 = 0, \lambda_2 = 0, \lambda_3 = 6

\mathbf{x}_1 = \begin{bmatrix} -3\\0\\1 \end{bmatrix},\quad \mathbf{x}_2 = \begin{bmatrix} -2\\1\\0 \end{bmatrix},\quad \mathbf{x}_3 = \begin{bmatrix} 1\\1\\1 \end{bmatrix}

A is not invertible if and only if 0 is an eigenvalue of A
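The singular example from the slide can be verified numerically (a sketch, with the eigenvalue ordering left to numpy):

```python
import numpy as np

# The rank-one singular matrix from the slide: all three rows equal.
A = np.array([[1.0, 2.0, 3.0],
              [1.0, 2.0, 3.0],
              [1.0, 2.0, 3.0]])

vals, vecs = np.linalg.eig(A)
print(np.sort(vals.real.round(6)))   # eigenvalues 0, 0, 6 (up to rounding)

# The nullspace special solutions are eigenvectors for lambda = 0 ...
assert np.allclose(A @ np.array([-3.0, 0.0, 1.0]), 0.0)
assert np.allclose(A @ np.array([-2.0, 1.0, 0.0]), 0.0)
# ... and [1, 1, 1] is an eigenvector for lambda = 6.
assert np.allclose(A @ np.ones(3), 6 * np.ones(3))
```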

Do EVs tell us anything about the rank?

A is not invertible if and only if 0 is an eigenvalue of A

Case 1 (no 0 eigenvalue): \(A\) is invertible \(\implies A\) has rank \(n\)

Case 2 (one or more 0 eigenvalues): False argument: each 0 eigenvalue corresponds to one unique special solution of \(A\mathbf{x} = \mathbf{0}\)

\(\implies\) number of 0 eigenvalues = dimension of nullspace = \(n - r\)

\(\implies r = n~-\) number of 0 eigenvalues

Counterexample:

A=\begin{bmatrix} 0&3&0\\ 0&0&3\\ 0&0&0\\ \end{bmatrix},\quad r=2,\quad \lambda_1=\lambda_2=\lambda_3 = 0

Explore more in HW5
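The counterexample is easy to confirm numerically; a short sketch:

```python
import numpy as np

# The counterexample from the slide: rank 2, yet every eigenvalue is 0.
A = np.array([[0.0, 3.0, 0.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

vals = np.linalg.eigvals(A)
assert np.allclose(vals, 0.0)           # all eigenvalues are 0
assert np.linalg.matrix_rank(A) == 2    # but the rank is 2

# So "rank = n minus the number of zero eigenvalues" fails in general:
# the count of zero eigenvalues can exceed the nullspace dimension.
print("counterexample verified")
```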

The Eigenstory

(where are we? Computing eigenvalues and the special matrices identity, projection, rotation, and singular are done; the rank question continues in HW5.)

Learning Objectives

What is the eigenstory? (the outline of it)

(achieved)

What are eigenvalues and eigenvectors?

How do you compute them?

What are eigenvalues/eigenvectors of some special matrices?