Row reduction and elementary matrices
Definition 4. Let \(V\) be a subspace. A basis for \(V\) is a collection of vectors \(\{v_{1},v_{2},\ldots,v_{r}\}\subset V\) with the following properties:
1) Every vector in \(V\) can be written as a linear combination \(a_{1}v_{1}+a_{2}v_{2}+\cdots+a_{r}v_{r}\).
2) The only scalars \(a_{1},a_{2},\ldots,a_{r}\) with \(a_{1}v_{1}+a_{2}v_{2}+\cdots+a_{r}v_{r}=0\) are \(a_{1}=a_{2}=\cdots=a_{r}=0\).
If a collection of vectors satisfies Property 1, then we say that \(\{v_{1},v_{2},\ldots,v_{r}\}\) spans \(V\).
If a collection of vectors satisfies Property 2, then we say that \(\{v_{1},v_{2},\ldots,v_{r}\}\) is linearly independent.
Definition 5. Let \(v_{1},v_{2},\ldots,v_{r}\) be vectors in \(\mathbb{R}^{M}\). If there are scalars \(a_{1},a_{2},\ldots,a_{r}\), at least one of which is not zero, such that \[a_{1}v_{1}+a_{2}v_{2}+\cdots+a_{r}v_{r} = 0,\] then we say that the vectors \(v_{1},v_{2},\ldots,v_{r}\) are (linearly) dependent.
Examples. 1) \(\left[\begin{matrix}1\\ 0\end{matrix}\right]\) and \(\left[\begin{matrix}2\\ 0\end{matrix}\right]\) are dependent since \(2\left[\begin{matrix}1\\ 0\end{matrix}\right]+(-1)\left[\begin{matrix}2\\ 0\end{matrix}\right]=0\)
2) \(\left[\begin{matrix}1\\ 0\\ -1\end{matrix}\right],\left[\begin{matrix}1\\ 1\\ 2\end{matrix}\right],\left[\begin{matrix}2.5\\ 1\\ 0.5\end{matrix}\right]\) are dependent since \[3\!\left[\begin{matrix}1\\ 0\\ -1\end{matrix}\right]\!\!+2\!\left[\begin{matrix}1\\ 1\\ 2\end{matrix}\right]\!-2\!\left[\begin{matrix}2.5\\ 1\\ 0.5\end{matrix}\right]\! =\! \left[\begin{matrix} 0\\ 0\\ 0\end{matrix}\right] \]
3) The zero vector by itself is dependent, since \(1\cdot 0=0\). It is the only single vector that is dependent.
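The dependence relation in Example 2 can be verified directly; here is a minimal Python check (not part of the notes):

```python
# Verify the dependence relation 3*v1 + 2*v2 - 2*v3 = 0 from Example 2.
v1 = [1, 0, -1]
v2 = [1, 1, 2]
v3 = [2.5, 1.0, 0.5]

combo = [3 * a + 2 * b - 2 * c for a, b, c in zip(v1, v2, v3)]
print(combo)  # [0.0, 0.0, 0.0]
```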
Given vectors \(v_{1},v_{2},\ldots,v_{r}\), how can we figure out whether they are dependent?
Equivalent problem: letting \(A\) be the matrix whose columns are \(v_{1},v_{2},\ldots,v_{r}\), does the equation \(Ax=0\) have a nonzero solution \(x\)?
Two matrices \(A\) and \(B\) are row equivalent if \(B\) can be obtained from \(A\) by some sequence of row operations. If \(A\) and \(B\) are row equivalent, then we write \(A\sim B\).
The row operations are the following:
1) Interchange two rows:
\[\begin{bmatrix} a & b & c\\ d & e & f\end{bmatrix} \sim \begin{bmatrix} d & e & f\\ a & b & c\end{bmatrix}\]
2) Multiply a row by a nonzero scalar \(\beta\neq 0\):
\[\begin{bmatrix} a & b & c\\ d & e & f\end{bmatrix} \sim \begin{bmatrix} \beta a & \beta b & \beta c\\ d & e & f\end{bmatrix}\]
3) Replace a row by itself plus a multiple of another row:
\[\begin{bmatrix} a & b & c\\ d & e & f\end{bmatrix} \sim \begin{bmatrix} a+\beta d & b+\beta e & c+\beta f\\ d & e & f\end{bmatrix}\]
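The three row operations translate directly into code; the following Python sketch (function names are my own, not from the notes) operates on a matrix stored as a list of rows:

```python
def swap(M, i, j):
    """Interchange rows i and j (0-indexed)."""
    M[i], M[j] = M[j], M[i]

def scale(M, i, beta):
    """Multiply row i by a nonzero scalar beta."""
    assert beta != 0
    M[i] = [beta * x for x in M[i]]

def add_multiple(M, i, j, beta):
    """Replace row i by (row i) + beta * (row j)."""
    M[i] = [x + beta * y for x, y in zip(M[i], M[j])]

M = [[0, 2, 4],
     [1, 0, 1],
     [0, 0, 1]]
swap(M, 0, 1)              # interchange rows 1 and 2
scale(M, 1, 0.5)           # multiply the new row 2 by 1/2
add_multiple(M, 0, 2, -1)  # subtract row 3 from row 1
print(M)  # [[1, 0, 0], [0.0, 1.0, 2.0], [0, 0, 1]]
```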
Example. Consider the matrix
\[\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\ 0 & 0 & 1\end{bmatrix}\]
Swap (Row 1) and (Row 2):
\[\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\ 0 & 0 & 1\end{bmatrix}\sim \begin{bmatrix} 1 & 0 & 1\\ 0 & 2 & 4\\ 0 & 0 & 1\end{bmatrix}\]
Multiply (Row \(2\)) by \(\frac{1}{2}\):
\[\begin{bmatrix} 1 & 0 & 1\\ 0 & 2 & 4\\ 0 & 0 & 1\end{bmatrix}\sim\begin{bmatrix} 1 & 0 & 1\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix}\]
Replace (Row 1) by (Row 1)\(-\)(Row 3):
\[\begin{bmatrix} 1 & 0 & 1\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix}\sim \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix}\]
Next we recall matrix multiplication. Let
\[A=\left[\begin{matrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & & a_{2n}\\ \vdots & & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn}\end{matrix}\right]\quad B=\left[\begin{matrix} b_{11} & b_{12} & \cdots & b_{1k}\\ b_{21} & b_{22} & & b_{2k}\\ \vdots & & \ddots & \vdots \\ b_{n1} & b_{n2} & \cdots & b_{nk}\end{matrix}\right]\]
Write \(B\) in terms of its columns:
\[B=\left[\begin{matrix} b_{1} & b_{2} & \cdots & b_{k}\end{matrix}\right]\]
where \(b_{1}=\left[\begin{matrix}b_{11}\\ b_{21}\\ \vdots\\ b_{n1}\end{matrix}\right],\ b_{2}=\left[\begin{matrix}b_{12}\\ b_{22}\\ \vdots\\ b_{n2}\end{matrix}\right],\ldots, b_{k}=\left[\begin{matrix}b_{1k}\\ b_{2k}\\ \vdots\\ b_{nk}\end{matrix}\right]\)
Then we define the product column by column:
\[AB=\left[\begin{matrix} Ab_{1} & Ab_{2} & \cdots & Ab_{k}\end{matrix}\right]\]
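The column-by-column definition can be sketched in a few lines of Python (helper names are my own, not from the notes):

```python
def matvec(A, v):
    """Compute A v for A given as a list of rows and v a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def matmul(A, B):
    """Build AB column by column: the j-th column of AB is A b_j."""
    k = len(B[0])
    cols = [matvec(A, [row[j] for row in B]) for j in range(k)]
    # Reassemble the columns into a list of rows.
    return [[cols[j][i] for j in range(k)] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```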
Alternatively, write \(B\) in terms of its rows:
\[B=\left[\begin{array}{c} R_{1}\\ R_{2}\\ \vdots \\ R_{n}\end{array}\right]\quad\text{where } R_{j} = \left[\begin{matrix} b_{j1} & b_{j2} & \cdots & b_{jk}\end{matrix}\right]\text{ for each } j=1,2,\ldots,n.\]
Then
\[AB=\left[\begin{matrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & & a_{2n}\\ \vdots & & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn}\end{matrix}\right]\left[\begin{matrix}R_{1}\\ R_{2}\\ \vdots \\ R_{n}\end{matrix}\right] = \left[\begin{matrix} a_{11}R_{1}+ a_{12}R_{2} + \cdots + a_{1n}R_{n}\\ a_{21}R_{1} + a_{22}R_{2} + \cdots + a_{2n}R_{n}\\ \vdots \\ a_{m1}R_{1} + a_{m2}R_{2} + \cdots + a_{mn}R_{n}\end{matrix}\right]\]
Each row of \(AB\) is a linear combination of rows of \(B\).
Example. Consider the product
\[\begin{bmatrix} 1 & -2 & 3\\ 1 & 0 & 0\\ 0 & 0 & 4\end{bmatrix}\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\ 0 & 0 & 1\end{bmatrix}\]
Row 1 in the product is the linear combination:
\[1\cdot \begin{bmatrix} 0 & 2 & 4\end{bmatrix} + (-2)\cdot \begin{bmatrix} 1 & 0 & 1\end{bmatrix} + 3\cdot \begin{bmatrix} 0 & 0 & 1\end{bmatrix} = \begin{bmatrix} -2 & 2 & 5\end{bmatrix}\]
Row 2 in the product is the linear combination:
\[1\cdot \begin{bmatrix} 0 & 2 & 4\end{bmatrix} + 0\cdot \begin{bmatrix} 1 & 0 & 1\end{bmatrix} + 0\cdot \begin{bmatrix} 0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 0 & 2 & 4\end{bmatrix}\]
Row 3 in the product is the linear combination:
\[0\cdot \begin{bmatrix} 0 & 2 & 4\end{bmatrix} + 0\cdot \begin{bmatrix} 1 & 0 & 1\end{bmatrix} + 4\cdot \begin{bmatrix} 0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 0 & 0 & 4\end{bmatrix}\]
Hence:
\[\begin{bmatrix} 1 & -2 & 3\\ 1 & 0 & 0\\ 0 & 0 & 4\end{bmatrix}\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\ 0 & 0 & 1\end{bmatrix} = \begin{bmatrix} -2 & 2 & 5\\ 0 & 2 & 4\\ 0 & 0 & 4\end{bmatrix}\]
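The row-combination rule can be sketched in Python as well (function names are my own); this snippet recomputes the product above row by row:

```python
def row_combo(coeffs, rows):
    """Linear combination coeffs[0]*rows[0] + ... of row vectors."""
    n = len(rows[0])
    return [sum(c * r[i] for c, r in zip(coeffs, rows)) for i in range(n)]

def matmul_rows(A, B):
    """Row i of AB is a_{i1} R_1 + ... + a_{in} R_n, with R_j the rows of B."""
    return [row_combo(row_a, B) for row_a in A]

A = [[1, -2, 3], [1, 0, 0], [0, 0, 4]]
B = [[0, 2, 4], [1, 0, 1], [0, 0, 1]]
print(matmul_rows(A, B))  # [[-2, 2, 5], [0, 2, 4], [0, 0, 4]]
```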
Every row operation can also be carried out by matrix multiplication: performing a row operation on \(A\) is the same as multiplying \(A\) on the left by the corresponding elementary matrix, the matrix obtained by applying that row operation to the identity matrix.
Example. Swap rows 2 and 3:
\[\begin{bmatrix} 2 & 2 & 1 & 0\\ 0 & -2 & 1 & 3\\ 1 & 0 & 1 & 2\end{bmatrix}\sim\begin{bmatrix} 2 & 2 & 1 & 0\\ 1 & 0 & 1 & 2\\ 0 & -2 & 1 & 3\end{bmatrix}\]
and indeed
\[\begin{bmatrix} 1 & 0 & 0\\ 0 & 0 & 1\\ 0 & 1 & 0\end{bmatrix}\begin{bmatrix} 2 & 2 & 1 & 0\\ 0 & -2 & 1 & 3\\ 1 & 0 & 1 & 2\end{bmatrix} = \begin{bmatrix} 2 & 2 & 1 & 0\\ 1 & 0 & 1 & 2\\ 0 & -2 & 1 & 3\end{bmatrix}\]
Example. Multiply row 2 by \(3\):
\[\begin{bmatrix} 2 & 2 & 1 & 0\\ 0 & -2 & 1 & 3\\ 1 & 0 & 1 & 2\end{bmatrix}\sim\begin{bmatrix} 2 & 2 & 1 & 0\\ 0 & -6 & 3 & 9\\ 1 & 0 & 1 & 2\end{bmatrix}\]
and indeed
\[\begin{bmatrix} 1 & 0 & 0\\ 0 & 3 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 2 & 2 & 1 & 0\\ 0 & -2 & 1 & 3\\ 1 & 0 & 1 & 2\end{bmatrix} = \begin{bmatrix} 2 & 2 & 1 & 0\\ 0 & -6 & 3 & 9\\ 1 & 0 & 1 & 2\end{bmatrix}\]
Example. Replace row 1 by row 1 plus \((-2)\) times row 3:
\[\begin{bmatrix} 2 & 2 & 1 & 0\\ 0 & -6 & 3 & 9\\ 1 & 0 & 1 & 2\end{bmatrix}\sim\begin{bmatrix} 0 & 2 & -1 & -4\\ 0 & -6 & 3 & 9\\ 1 & 0 & 1 & 2\end{bmatrix}\]
and indeed
\[\begin{bmatrix} 1 & 0 & -2\\ 0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 2 & 2 & 1 & 0\\ 0 & -6 & 3 & 9\\ 1 & 0 & 1 & 2\end{bmatrix} = \begin{bmatrix} 0 & 2 & -1 & -4\\ 0 & -6 & 3 & 9\\ 1 & 0 & 1 & 2\end{bmatrix}\]
Example. Consider the matrix
\[\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\ 0 & 0 & 1\end{bmatrix}\]
Swap (Row 1) and (Row 2):
\[\begin{bmatrix} 0 & 1 & 0\\ 1 & 0 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 1 & 0 & 1\\ 0 & 2 & 4\\ 0 & 0 & 1\end{bmatrix}\]
Multiply (Row \(2\)) by \(\frac{1}{2}\):
\[\begin{bmatrix} 1 & 0 & 0\\ 0 & \frac{1}{2} & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 1 & 0 & 1\\ 0 & 2 & 4\\ 0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 1 & 0 & 1\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix}\]
Replace (Row 1) by (Row 1)\(-\)(Row 3):
\[\begin{bmatrix} 1 & 0 & -1\\ 0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 1 & 0 & 1\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix}\]
Putting the three steps together (swap (Row 1) and (Row 2), multiply (Row \(2\)) by \(\frac{1}{2}\), replace (Row 1) by (Row 1)\(-\)(Row 3)):
\[\underbrace{\begin{bmatrix} 1 & 0 & -1\\ 0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 1 & 0 & 0\\ 0 & \frac{1}{2} & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 0 & 1 & 0\\ 1 & 0 & 0\\ 0 & 0 & 1\end{bmatrix}}_{R}\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix}\]
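This composition can be checked numerically; in the Python sketch below (not part of the notes), each elementary matrix is written out and the three are applied in order:

```python
def matmul(A, B):
    """Matrix product for matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Elementary matrices, in the order the operations are applied:
E1 = [[0, 1, 0], [1, 0, 0], [0, 0, 1]]    # swap rows 1 and 2
E2 = [[1, 0, 0], [0, 0.5, 0], [0, 0, 1]]  # multiply row 2 by 1/2
E3 = [[1, 0, -1], [0, 1, 0], [0, 0, 1]]   # row 1 <- row 1 - row 3

A = [[0, 2, 4], [1, 0, 1], [0, 0, 1]]
result = matmul(E3, matmul(E2, matmul(E1, A)))
print(result)
```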
Multiplying the three elementary matrices together gives a single matrix that carries out all three row operations at once:
\[\begin{bmatrix} 0 & 1 & -1\\ \frac{1}{2} & 0 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix}\]
Note that every row operation can be undone by another row operation:
1) Swapping (Row \(i\)) and (Row \(j\)) is undone by swapping them again.
2) Multiplying (Row \(i\)) by \(\beta\neq 0\) is undone by multiplying (Row \(i\)) by \(\frac{1}{\beta}\).
3) Replacing (Row \(i\)) by (Row \(i\))\(+\beta\)(Row \(j\)) is undone by replacing (Row \(i\)) by (Row \(i\))\(-\beta\)(Row \(j\)).
For example, the reduction above used the operations:
Swap (Row 1) and (Row 2),
Multiply (Row \(2\)) by \(\frac{1}{2}\),
Replace (Row 1) by (Row 1)-(Row 3):
\[\begin{bmatrix} 0 & 1 & -1\\ \frac{1}{2} & 0 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix}\]
\[\begin{bmatrix} 1 & 0 & -1\\ 0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 1 & 0 & 0\\ 0 & \frac{1}{2} & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 0 & 1 & 0\\ 1 & 0 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix}\]
To undo the reduction, apply the inverse operations in the opposite order:
Replace (Row 1) by (Row 1)+(Row 3),
Multiply (Row \(2\)) by \(2\),
Swap (Row 1) and (Row 2):
\[\begin{bmatrix} 0 & 1 & 0\\ 1 & 0 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 1 & 0 & 0\\ 0 & 2 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 1 & 0 & 1\\ 0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\0 & 0 & 1\end{bmatrix}\]
Again, the product of these elementary matrices performs all three inverse operations at once:
\[\begin{bmatrix} 0 & 2 & 0\\ 1 & 0 & 1\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 2\\ 0 & 0 & 1\end{bmatrix} = \begin{bmatrix} 0 & 2 & 4\\ 1 & 0 & 1\\0 & 0 & 1\end{bmatrix}\]
To summarize:
If \(A\sim B\) then there is a matrix \(R\), which is a product of elementary matrices, such that \(B=RA\).
Moreover, there is a matrix \(S\), which is also a product of elementary matrices, such that \(A=SB\).
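In the running example, the two combined matrices computed earlier play the roles of \(R\) and \(S\), and they multiply to the identity; a quick Python check:

```python
def matmul(A, B):
    """Matrix product for matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

R = [[0, 1, -1], [0.5, 0, 0], [0, 0, 1]]  # reduces A to B
S = [[0, 2, 0], [1, 0, 1], [0, 0, 1]]     # restores B to A

print(matmul(S, R))  # the identity matrix
print(matmul(R, S))  # also the identity matrix
```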
Next time we will prove:
Theorem. If \(A\) and \(B\) are \(M\times N\) matrices such that \(A\sim B\), then
\[\{x\in\mathbb{R}^{N} : Ax=0\} = \{x\in\mathbb{R}^{N} : Bx=0\}.\]
A matrix is in reduced row echelon form (RREF) if it satisfies all of the following:
1) All rows consisting entirely of zeros are at the bottom.
2) The first nonzero entry (the leading entry) of each nonzero row is \(1\).
3) The leading entry of each nonzero row is strictly to the right of the leading entry of the row above it.
4) Each leading \(1\) is the only nonzero entry in its column.
Which of the following are in RREF?
a) \(\begin{bmatrix} 1 & 0 & 2\\ 0 & 1 & 0\end{bmatrix}\)
b) \(\begin{bmatrix} 0 & 1 & 1\\ 1 & 0 & 0\end{bmatrix}\)
c) \(\begin{bmatrix} 1 & 0 & 1\\ 0 & 0 & 0\\ 0 & 1 & 2 \end{bmatrix}\)
d) \(\begin{bmatrix} 1 & 0 & 1\\ 0 & 1 & 2\\ 0 & 0 & 1\\ 0 & 0 & 0\end{bmatrix}\)
f) \(\begin{bmatrix} 1 & 2 & 0\\ 0 & 0 & 2\end{bmatrix}\)
g) \(\begin{bmatrix} 0 & 1 & 2 & 0\\ 0 & 0 & 0 & 1\end{bmatrix}\)
Only a) and g) are in RREF: in b) the leading \(1\) of Row 2 lies to the left of the leading \(1\) of Row 1; in c) the zero row is not at the bottom; in d) the leading \(1\) of Row 3 is not the only nonzero entry in its column; in f) the leading entry of Row 2 is \(2\), not \(1\).
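These answers can be double-checked mechanically; the following Python sketch (the function is my own, not from the notes) tests the RREF conditions on each matrix:

```python
def is_rref(M):
    """Check the RREF conditions for a matrix given as a list of rows."""
    n_rows = len(M)
    last_pivot = -1
    seen_zero_row = False
    for row in M:
        nonzero = [j for j, x in enumerate(row) if x != 0]
        if not nonzero:
            seen_zero_row = True
            continue
        if seen_zero_row:
            return False  # nonzero row below a zero row
        lead = nonzero[0]
        if row[lead] != 1:
            return False  # leading entry must be 1
        if lead <= last_pivot:
            return False  # leading 1s must move strictly right
        if any(M[i][lead] != 0 for i in range(n_rows) if M[i] is not row):
            return False  # pivot column must be zero elsewhere
        last_pivot = lead
    return True

examples = {
    "a": [[1, 0, 2], [0, 1, 0]],
    "b": [[0, 1, 1], [1, 0, 0]],
    "c": [[1, 0, 1], [0, 0, 0], [0, 1, 2]],
    "d": [[1, 0, 1], [0, 1, 2], [0, 0, 1], [0, 0, 0]],
    "f": [[1, 2, 0], [0, 0, 2]],
    "g": [[0, 1, 2, 0], [0, 0, 0, 1]],
}
print({k: is_rref(M) for k, M in examples.items()})
# {'a': True, 'b': False, 'c': False, 'd': False, 'f': False, 'g': True}
```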