Day 13:

Matrix representation of a linear map

Matrix representation

Example 1. Let \(D:\mathbb{P}_{2}\to\mathbb{P}_{1}\) be given by \[D(f(x)) = f'(x).\]

Fix the bases \(S = \{1,x,x^2\}\) for \(\mathbb{P}_{2}\) and \(T = \{1,x\}\) for \(\mathbb{P}_{1}\).

 

We compute \(D(v)\) for each \(v\in S\), and write the output as a linear combination of the vectors in \(T\):

\[D(1) = 0 = 0\cdot 1 + 0\cdot x\]

\[D(x) = 1 = 1\cdot 1 + 0\cdot x\]

\[D(x^{2}) = 2x = 0\cdot 1 + 2\cdot x\]

Then, the matrix representation of \(D\) with respect to \(S\) and \(T\) is 

\[\begin{bmatrix} 0 & 1 & 0\\ 0 & 0 & 2\end{bmatrix}\]
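This computation can be checked mechanically: differentiate each basis polynomial of \(S\) and read off the coefficients of the image in \(T\). A short SymPy sketch (an illustration, not part of the original notes):

```python
import sympy as sp

x = sp.symbols('x')
S = [sp.Integer(1), x, x**2]      # basis of P_2
T = [sp.Integer(1), x]            # basis of P_1

# Column j of the representation holds the T-coordinates of D(S[j]) = d/dx S[j].
cols = []
for v in S:
    image = sp.Poly(sp.diff(v, x), x)
    cols.append([image.coeff_monomial(t) for t in T])

A = sp.Matrix(cols).T
print(A)  # Matrix([[0, 1, 0], [0, 0, 2]])
```

Reading coefficients directly works here because \(T\) consists of distinct monomials; a non-monomial basis requires solving a linear system, as in the next example.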


Example 2. Let \(D:\mathbb{P}_{2}\to\mathbb{P}_{1}\) be given by \[D(f(x)) = f'(x).\]

Fix the bases \(S = \{1,x,x^2\}\) for \(\mathbb{P}_{2}\) and \(T = \{1,2x\}\) for \(\mathbb{P}_{1}\).

 

We compute \(D(v)\) for each \(v\in S\), and write the output as a linear combination of the vectors in \(T\):

\[D(1) = 0 = 0\cdot 1 + 0\cdot (2x)\]

\[D(x) = 1 = 1\cdot 1 + 0\cdot (2x)\]

\[D(x^{2}) = 2x = 0\cdot 1 + 1\cdot (2x)\]

Then, the matrix representation of \(D\) with respect to \(S\) and \(T\) is 

\[\begin{bmatrix} 0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix}\]
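The same check works for the non-standard basis \(T=\{1,2x\}\): expand each image over the monomials \(1,x\) and solve a small linear system for its \(T\)-coordinates (a SymPy sketch, not part of the original notes):

```python
import sympy as sp

x = sp.symbols('x')
S = [sp.Integer(1), x, x**2]      # basis of P_2
T = [sp.Integer(1), 2*x]          # non-standard basis of P_1

# Monomial-coefficient vectors of the T-basis vectors, stacked as columns.
T_mat = sp.Matrix([[sp.Poly(t, x).coeff_monomial(m) for t in T] for m in (1, x)])

cols = []
for v in S:
    image = sp.Poly(sp.diff(v, x), x)
    rhs = sp.Matrix([image.coeff_monomial(m) for m in (1, x)])
    cols.append(T_mat.solve(rhs))  # T-coordinates of D(v)

A = sp.Matrix.hstack(*cols)
print(A)  # Matrix([[0, 1, 0], [0, 0, 1]])
```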

Note that this matrix representation is different from the previous one, but the linear map \(D\) is the same.


Example 3. Let \(M:\mathbb{R}^{3}\to\mathbb{R}^{2}\) be given by \[M([x_{1}\ \ x_{2}\ \ x_{3}]^{\top}) = \begin{bmatrix} 1 & 2 & 0\\ 0 & 3 & 4\end{bmatrix}\begin{bmatrix}x_{1}\\ x_{2}\\ x_{3}\end{bmatrix}.\]

Fix the bases \[S = \left\{\begin{bmatrix}1\\ 0\\ 0\end{bmatrix},\begin{bmatrix}0\\ 1\\ 0\end{bmatrix},\begin{bmatrix}0\\ 0\\ 1\end{bmatrix}\right\}\quad\text{and}\quad T = \left\{\begin{bmatrix}1\\ 0\end{bmatrix},\begin{bmatrix}0\\ 1\end{bmatrix}\right\}\] for \(\mathbb{R}^3\) and \(\mathbb{R}^{2}\), respectively.

We compute \(M(x)\) for each \(x\in S\), and write the output as a linear combination of the vectors in \(T\):

\[M\left(\begin{bmatrix} 1\\ 0\\ 0\end{bmatrix}\right) = \begin{bmatrix} 1\\ 0\end{bmatrix} = 1\begin{bmatrix} 1\\ 0\end{bmatrix} + 0\begin{bmatrix} 0\\ 1\end{bmatrix}\]

 


Example 3 continued.

\[M\left(\begin{bmatrix} 0\\ 1\\ 0\end{bmatrix}\right) = \begin{bmatrix} 2\\ 3\end{bmatrix} = 2\begin{bmatrix} 1\\ 0\end{bmatrix} + 3\begin{bmatrix} 0\\ 1\end{bmatrix}\]

\[M\left(\begin{bmatrix} 0\\ 0\\ 1\end{bmatrix}\right) = \begin{bmatrix} 0\\ 4\end{bmatrix} = 0\begin{bmatrix} 1\\ 0\end{bmatrix} + 4\begin{bmatrix} 0\\ 1\end{bmatrix}\]

Then, the matrix representation of \(M\) with respect to \(S\) and \(T\) is 

\[\begin{bmatrix} 1 & 2 & 0\\ 0 & 3 & 4\end{bmatrix}\]


Example 4. Let \(M:\mathbb{R}^{3}\to\mathbb{R}^{2}\) be given by \[M([x_{1}\ \ x_{2}\ \ x_{3}]^{\top}) = \begin{bmatrix} 1 & 2 & 0\\ 0 & 3 & 4\end{bmatrix}\begin{bmatrix}x_{1}\\ x_{2}\\ x_{3}\end{bmatrix}.\]

Fix the bases \[S = \left\{\begin{bmatrix}1\\ 1\\ 1\end{bmatrix},\begin{bmatrix}1\\ 1\\ 0\end{bmatrix},\begin{bmatrix}1\\ 0\\ 0\end{bmatrix}\right\}\quad\text{and}\quad T = \left\{\begin{bmatrix}1\\ 1\end{bmatrix},\begin{bmatrix}1\\ -1\end{bmatrix}\right\}\] for \(\mathbb{R}^3\) and \(\mathbb{R}^{2}\), respectively.

We compute \(M(x)\) for each \(x\in S\), and write the output as a linear combination of the vectors in \(T\):

\[M\left(\begin{bmatrix} 1\\ 1\\ 1\end{bmatrix}\right) = \begin{bmatrix} 3\\ 7\end{bmatrix} = 5\begin{bmatrix} 1\\ 1\end{bmatrix} + (-2)\begin{bmatrix} 1\\ -1\end{bmatrix}\]

 


Example 4 continued.

\[M\left(\begin{bmatrix} 1\\ 1\\ 0\end{bmatrix}\right) = \begin{bmatrix} 3\\ 3\end{bmatrix} = 3\begin{bmatrix} 1\\ 1\end{bmatrix} + 0\begin{bmatrix} 1\\ -1\end{bmatrix}\]

\[M\left(\begin{bmatrix} 1\\ 0\\ 0\end{bmatrix}\right) = \begin{bmatrix} 1\\ 0\end{bmatrix} = \tfrac{1}{2}\begin{bmatrix} 1\\ 1\end{bmatrix} + \tfrac{1}{2}\begin{bmatrix} 1\\ -1\end{bmatrix}\]

Then, the matrix representation of \(M\) with respect to \(S\) and \(T\) is 

\[\begin{bmatrix} 5 & 3 & \frac{1}{2}\\[1ex] -2 & 0 & \frac{1}{2}\end{bmatrix}\]
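The \(T\)-coordinates in Example 4 come from solving a small linear system: if \(P\) has the vectors of \(T\) as its columns, then the \(j\)th column of the representation is \(P^{-1}M v_{j}\). A NumPy check of this slide (a sketch; the names `M`, `S`, `P` are ours, not from the notes):

```python
import numpy as np

M = np.array([[1, 2, 0], [0, 3, 4]], dtype=float)
S = np.array([[1, 1, 1], [1, 1, 0], [1, 0, 0]], dtype=float).T  # domain basis vectors as columns
P = np.array([[1, 1], [1, -1]], dtype=float)                    # codomain basis vectors as columns

# Column j of A is the T-coordinate vector of M applied to the jth basis vector.
A = np.linalg.solve(P, M @ S)
print(A)
```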


Example. Let \(D:\mathbb{P}_{2}\to\mathbb{P}_{1}\) be given by \[D(f(x)) = f'(x).\]

Fix the bases \(S = \{1,x,x^2\}\) for \(\mathbb{P}_{2}\) and \(T = \{1,x\}\) for \(\mathbb{P}_{1}\).

Then, the matrix representation of \(D\) with respect to \(S\) and \(T\) is 

\[\begin{bmatrix} 0 & 1 & 0\\ 0 & 0 & 2\end{bmatrix}\]

If we take an arbitrary vector \(v=ax^{2}+bx+c\in\mathbb{P}_{2}\), then the coefficients of \(v\) with respect to the basis \(S\) are \(c,b,a\) (NOTE THE ORDER). Arrange these as a column vector, multiply by the matrix above, and we obtain

\[\begin{bmatrix} 0 & 1 & 0\\ 0 & 0 & 2\end{bmatrix}\begin{bmatrix}c\\ b\\ a\end{bmatrix} = \begin{bmatrix}b\\ 2a\end{bmatrix}\] Use these as the coefficients on the vectors in \(T\) (ORDER MATTERS) and we get \(b+2ax\), which is \(D(v)\).
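This check can be run concretely, say for \(v = 4x^{2}+3x+7\) (an illustrative choice, not from the notes):

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[0, 1, 0], [0, 0, 2]])  # matrix representation of D w.r.t. S and T

a, b, c = 4, 3, 7                 # v = 4x^2 + 3x + 7
coords = sp.Matrix([c, b, a])     # coordinates w.r.t. S = {1, x, x^2} -- note the order
out = A * coords                  # = [b, 2a], the coordinates of D(v) w.r.t. T = {1, x}
Dv = out[0] * 1 + out[1] * x      # rebuild the polynomial from the T-coordinates
assert Dv == sp.diff(a*x**2 + b*x + c, x)
```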

Matrix representation of a linear map

Let \(V\) and \(W\) be vector spaces with bases \(\{v_{i}\}_{i=1}^{n}\) and \(\{w_{i}\}_{i=1}^{m},\) respectively.

Let \(L:V\to W\) be a linear map.

For each \(j\in\{1,2,\ldots,n\}\) there are scalars \(a_{1j},a_{2j},\ldots,a_{mj}\) such that

\[L(v_{j}) = a_{1j}w_{1} + a_{2j}w_{2} + \cdots + a_{mj}w_{m} = \sum_{i=1}^{m}a_{ij}w_{i}\]

The matrix

\[\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{m1} & a_{m2} & \cdots & a_{mn}\end{bmatrix}\]

is called the matrix representation of \(L\) with respect to the bases \(\{v_{i}\}_{i=1}^{n}\) and \(\{w_{i}\}_{i=1}^{m}\).
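For maps between coordinate spaces, this definition translates directly into code: apply \(L\) to each domain basis vector and solve for its coordinates in the codomain basis. A NumPy sketch (`matrix_rep` is our hypothetical helper name, not from the notes):

```python
import numpy as np

def matrix_rep(L, S, T):
    """Matrix of the linear map L with respect to bases S (domain) and T (codomain).

    S and T are lists of basis vectors; column j holds the T-coordinates of
    L(S[j]), found by solving the linear system T_mat @ coords = L(S[j]).
    """
    T_mat = np.column_stack(T)
    return np.column_stack([np.linalg.solve(T_mat, L(v)) for v in S])

# Example 3: L(x) = Mx with the standard bases recovers M itself.
M = np.array([[1.0, 2.0, 0.0], [0.0, 3.0, 4.0]])
S = list(np.eye(3))
T = list(np.eye(2))
print(matrix_rep(lambda v: M @ v, S, T))
```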

Coordinate vector

Let \(V\) be a vector space with basis \(\{v_{i}\}_{i=1}^{n}\).

Given a vector \(v\in V\), there are scalars \(b_{1},b_{2},\ldots,b_{n}\) such that

\[v=\sum_{i=1}^{n}b_{i}v_{i}.\]

The column vector

\[\begin{bmatrix} b_{1}\\ b_{2}\\ \vdots\\ b_{n}\end{bmatrix}\in\mathbb{R}^{n}\]

is the coordinate vector of \(v\) with respect to the basis.
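In \(\mathbb{R}^{n}\), finding a coordinate vector amounts to solving a linear system: stack the basis vectors as the columns of a matrix and solve. A NumPy sketch using the basis \(T\) from Example 4 (an illustration, not from the notes):

```python
import numpy as np

basis = np.array([[1.0, 1.0], [1.0, -1.0]]).T  # basis vectors as columns
v = np.array([3.0, 7.0])

b = np.linalg.solve(basis, v)  # coordinate vector of v in this basis
print(b)                       # v = 5*[1,1] + (-2)*[1,-1]
```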

Theorem. Let \(V\) and \(W\) be vector spaces with bases \(S=\{v_{i}\}_{i=1}^{n}\) and \(T=\{w_{i}\}_{i=1}^{m},\) respectively. Let \(L:V\to W\) be a linear map, and \(v\in V\). If \(A\in\mathbb{R}^{m\times n}\) is the matrix representation of \(L\) with respect to \(S\) and \(T\), and \(x\in\mathbb{R}^{n}\) is the coordinate vector of \(v\) with respect to \(S\), then \(Ax\) is the coordinate vector of \(L(v)\) with respect to \(T\).

Proof. Let \(A = [a_{ij}]\) and \(x=[b_{1}\ \ b_{2}\ \ \cdots\ \ b_{n}]^{\top}\). Then

\[Ax=\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{m1} & a_{m2} & \cdots & a_{mn}\end{bmatrix}\begin{bmatrix} b_{1}\\ b_{2}\\ \vdots\\ b_{n}\end{bmatrix} = \begin{bmatrix} a_{11}b_{1} + a_{12}b_{2} + \cdots + a_{1n}b_{n}\\ a_{21}b_{1} + a_{22}b_{2} + \cdots + a_{2n}b_{n}\\ \vdots\\ a_{m1}b_{1} + a_{m2}b_{2} + \cdots + a_{mn}b_{n}\end{bmatrix}\]

In particular, let \(c_{i}\) denote the \(i\)th entry of \(Ax\), that is,

\[c_{i} = \sum_{j=1}^{n}a_{ij}b_{j}.\]

We wish to show that \([c_{1}\ \ c_{2}\ \ \cdots\ \ c_{m}]^{\top}\) is the coordinate vector of \(L(v)\) with respect to \(T\).

Proof continued.

\[\sum_{i=1}^{m}c_{i}w_{i} = \sum_{i=1}^{m}\left(\sum_{j=1}^{n}a_{ij}b_{j}\right)w_{i} = \sum_{i=1}^{m}\sum_{j=1}^{n}a_{ij}b_{j}w_{i} = \sum_{j=1}^{n}\sum_{i=1}^{m}a_{ij}b_{j}w_{i}\]

\[ = \sum_{j=1}^{n}b_{j}\sum_{i=1}^{m}a_{ij}w_{i} = \sum_{j=1}^{n}b_{j}L(v_{j}) = L\left(\sum_{j=1}^{n}b_{j}v_{j}\right) = L(v).\]

\(\Box\)

The same calculation, written out without summation notation:

Proof continued.


\[\sum_{i=1}^{m}c_{i}w_{i} = c_{1}w_{1}+c_{2}w_{2}+c_{3}w_{3} + \cdots + c_{m}w_{m}\]

\[= \big(a_{11}b_{1} + a_{12}b_{2} + a_{13}b_{3} + \cdots + a_{1n}b_{n}\big)w_{1}\]

\[+\big(a_{21}b_{1} + a_{22}b_{2} + a_{23}b_{3} + \cdots + a_{2n}b_{n}\big)w_{2}\]

\[+\big(a_{31}b_{1} + a_{32}b_{2} + a_{33}b_{3} + \cdots + a_{3n}b_{n}\big)w_{3}\]

\[\vdots\]

\[+ \big(a_{m1}b_{1} + a_{m2}b_{2} + a_{m3}b_{3} + \cdots + a_{mn}b_{n}\big)w_{m}\]

\[= a_{11}b_{1}w_{1} + a_{12}b_{2}w_{1} + a_{13}b_{3}w_{1} + \cdots + a_{1n}b_{n}w_{1}\]

\[+ a_{21}b_{1}w_{2} + a_{22}b_{2}w_{2} + a_{23}b_{3}w_{2} + \cdots + a_{2n}b_{n}w_{2}\]

\[+ a_{31}b_{1}w_{3} + a_{32}b_{2}w_{3} + a_{33}b_{3}w_{3} + \cdots + a_{3n}b_{n}w_{3}\]

\[\vdots\]

\[+ a_{m1}b_{1}w_{m} + a_{m2}b_{2}w_{m} + a_{m3}b_{3}w_{m} + \cdots + a_{mn}b_{n}w_{m}\]

Proof continued.


\[\sum_{i=1}^{m}c_{i}w_{i} = a_{11}b_{1}w_{1} + a_{12}b_{2}w_{1} + a_{13}b_{3}w_{1} + \cdots + a_{1n}b_{n}w_{1}\]

\[+ a_{21}b_{1}w_{2} + a_{22}b_{2}w_{2} + a_{23}b_{3}w_{2} + \cdots + a_{2n}b_{n}w_{2}\]

\[+ a_{31}b_{1}w_{3} + a_{32}b_{2}w_{3} + a_{33}b_{3}w_{3} + \cdots + a_{3n}b_{n}w_{3}\]

\[\vdots\]

\[+ a_{m1}b_{1}w_{m} + a_{m2}b_{2}w_{m} + a_{m3}b_{3}w_{m} + \cdots + a_{mn}b_{n}w_{m}\]

\[=b_{1}\big(a_{11}w_{1} + a_{21}w_{2} + a_{31}w_{3} + \cdots + a_{m1}w_{m}\big)\]

\[+ b_{2}\big(a_{12}w_{1} + a_{22}w_{2} + a_{32}w_{3} + \cdots + a_{m2}w_{m}\big)\]

\[+b_{3}\big(a_{13}w_{1} + a_{23}w_{2} + a_{33}w_{3} + \cdots + a_{m3}w_{m}\big)\]

\[\vdots\]

\[+b_{n}\big(a_{1n}w_{1} + a_{2n}w_{2} + a_{3n}w_{3} + \cdots + a_{mn}w_{m}\big)\]

Proof continued.


\[\sum_{i=1}^{m}c_{i}w_{i} = b_{1}\big(a_{11}w_{1} + a_{21}w_{2} + a_{31}w_{3} + \cdots + a_{m1}w_{m}\big)\]

\[+ b_{2}\big(a_{12}w_{1} + a_{22}w_{2} + a_{32}w_{3} + \cdots + a_{m2}w_{m}\big)\]

\[+b_{3}\big(a_{13}w_{1} + a_{23}w_{2} + a_{33}w_{3} + \cdots + a_{m3}w_{m}\big)\]

\[\vdots\]

\[+b_{n}\big(a_{1n}w_{1} + a_{2n}w_{2} + a_{3n}w_{3} + \cdots + a_{mn}w_{m}\big)\]

\[ = b_{1}L(v_{1}) + b_{2}L(v_{2}) + b_{3} L(v_{3}) + \cdots + b_{n}L(v_{n})\]

\[ = L(b_{1}v_{1}) + L(b_{2}v_{2}) + L(b_{3} v_{3}) + \cdots + L(b_{n}v_{n})\]

\[ = L(b_{1}v_{1}+b_{2}v_{2}) + L(b_{3} v_{3}) + \cdots + L(b_{n}v_{n})\]

\[ = L(b_{1}v_{1}+b_{2}v_{2} + b_{3} v_{3}) + \cdots + L(b_{n}v_{n})\]

\[\vdots\]

\[ = L(b_{1}v_{1}+b_{2}v_{2} + b_{3} v_{3} + \cdots + b_{n}v_{n}) = L(v)\]

Matrix representation of linear maps \(L:\mathbb{R}^{n}\to\mathbb{R}^{m}\)

Given a linear map \(L:\mathbb{R}^{n}\to\mathbb{R}^{m}\), and bases \(S = \{v_{i}\}_{i=1}^{n}\) and \(T = \{w_{i}\}_{i=1}^{m}\) for \(\mathbb{R}^{n}\) and \(\mathbb{R}^{m}\), respectively, we can talk about the matrix representation of \(L\) with respect to \(S\) and \(T\).

 

However, we will sometimes refer to the matrix representation of \(L\) without referring to bases. In that case, we take the bases for both spaces to be the standard basis:

Definition. The standard basis for \(\mathbb{R}^{d}\) is the set

\[\left\{\begin{bmatrix}1\\ 0\\ 0\\ \vdots\\ 0\end{bmatrix},\begin{bmatrix}0\\ 1\\ 0\\ \vdots\\ 0\end{bmatrix},\begin{bmatrix}0\\ 0\\ 1\\ \vdots\\ 0\end{bmatrix},\ldots,\begin{bmatrix}0\\ 0\\ 0\\ \vdots\\ 1\end{bmatrix}\right\}\subset\mathbb{R}^{d}.\]

The \(i\)th element in this set is called \(e_{i}\).


Let \(L:\mathbb{R}^{n}\to\mathbb{R}^{m}\) be a linear map, and let \(A\in\mathbb{R}^{m\times n}\) be the matrix representation of \(L\).

 

Note that the coordinate vector of \(x\) with respect to the standard basis is \(x\). Hence, by the previous theorem we have \(L(x) = Ax\) for all \(x\in\mathbb{R}^{n}\). In particular,

\[\operatorname{im}(L) = \{L(x) : x\in\mathbb{R}^{n}\} = \{Ax : x\in\mathbb{R}^{n}\} = C(A)\]

\[\operatorname{ker}(L) = \{x : L(x) = 0\} = \{x : Ax=0\} = N(A)\]

Hence, \(\operatorname{im}(L)\) is the column space of the matrix representation of \(L\), and \(\operatorname{ker}(L)\) is the nullspace of the matrix representation of \(L\). 
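For a concrete matrix these spaces can be computed directly, here with the matrix from Example 3 (a SymPy sketch, not part of the original notes):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 0], [0, 3, 4]])

# ker(L) = N(A): a basis of the nullspace (nullity = 3 - rank = 1)
print(A.nullspace())

# im(L) = C(A): a basis of the column space (rank = 2, so im(L) = R^2)
print(A.columnspace())
```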

Let \(A\) be an \(m\times n\) matrix, and define the linear map \(L:\mathbb{R}^{n}\to\mathbb{R}^{m}\) by \(L(x) = Ax.\)

Again, note that the coordinate vector of \(x\) with respect to the standard basis is \(x\). Hence, if \(B\in\mathbb{R}^{m\times n}\) is the matrix representation of \(L\), then \(L(x) = Bx\).

Note that \(Be_{i}\) is the \(i\)th column of \(B\), and \(Ae_{i}\) is the \(i\)th column of \(A\). Hence, for each \(i\in\{1,2,\ldots,n\}\) we have

\[Ae_{i} = L(e_{i}) = Be_{i},\]

that is, the \(i\)th columns of \(A\) and \(B\) are equal for each \(i\). Therefore, \(A=B\), that is, the matrix representation of \(L\) is \(A\).

Since a matrix \(A\) gives us a linear map \(L\), we often use the notation \(\operatorname{im}(A)\) and \(\operatorname{ker}(A)\) for the image and kernel of \(L\).

Matrix of an isomorphism

Proposition. Let \(V\) and \(W\) be vector spaces with bases \(S = \{v_{i}\}_{i=1}^{n}\) and \(T = \{w_{i}\}_{i=1}^{m}\), respectively. A linear map \(L:V\to W\) is an isomorphism if and only if the matrix representation of \(L\) with respect to \(S\) and \(T\) is invertible.

Proof. Suppose \(L\) is an isomorphism. This implies that \(\operatorname{im}(L) = W\) and \(\operatorname{ker}(L) = \{0\}\).

Suppose that \(x\in\mathbb{R}^{n}\) is in the nullspace of \(A\). Let \(v\in V\) be the vector whose coordinate vector is \(x\). By the previous theorem \(Ax\) is the coordinate vector of \(L(v)\). However, \(Ax=0\), and hence \(L(v) = 0\), that is, \(v\in\operatorname{ker}(L) = \{0\}\). We conclude that \(v=0\), and hence \(x=0\). This shows \(N(A) = \{0\}\).

A similar argument (which you should write out) shows that \(C(A) = \mathbb{R}^{m}.\) By the Rank-Nullity Theorem we have

\[n = \operatorname{rank}(A) + \operatorname{nullity}(A) = m+0 = m.\]

Proof continued. Thus, \(m=n\) and \(A\) is a square matrix.

Since \[\operatorname{rank}(A) = m = (\# \text{ rows of }A) = n = (\#\text{ columns of }A) ,\] we see that \(\operatorname{rref}(A)\) has a pivot in each column and each row. Therefore, \(\operatorname{rref}(A) = I\), and hence by a previous theorem \(A\) is invertible. \(\Box\)
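As a quick numerical sanity check of the proposition, take the isomorphism \(L(x) = Cx\) on \(\mathbb{R}^{2}\) with an invertible \(C\); its matrix representation with respect to any pair of bases is again invertible. A NumPy sketch with an illustrative choice of \(C\) and bases (not from the notes):

```python
import numpy as np

C = np.array([[1.0, 2.0], [0.0, 1.0]])     # invertible, so L(x) = Cx is an isomorphism
S = np.array([[1.0, 1.0], [1.0, -1.0]]).T  # domain basis, vectors as columns
T = np.eye(2)                              # codomain basis (standard)

A = np.linalg.solve(T, C @ S)              # matrix representation of L w.r.t. S and T
print(np.linalg.det(A))                    # nonzero, so A is invertible
```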

Corollary. If \(V\) and \(W\) are isomorphic finite-dimensional vector spaces, then \(\operatorname{dim}(V) = \operatorname{dim}(W)\).

Proof. The assumption that \(V\) and \(W\) are isomorphic means that there is a linear bijection \(L:V\to W\). The assumption that the spaces are finite-dimensional means that there are finite bases \(S = \{v_{i}\}_{i=1}^{n}\) for \(V\) and \(T = \{w_{i}\}_{i=1}^{m}\) for \(W\). By the previous proposition, the matrix representation of \(L\) with respect to \(S\) and \(T\) is an invertible matrix; in particular, it is a square matrix. Therefore \(n=m\). \(\Box\)

End Day 13

Linear Algebra Day 13

By John Jasper
