# The bigger picture

$$m=n:\quad \underset{A}{\begin{bmatrix} ~~~&~~~&~~~\\ ~~~&~~~&~~~\\ ~~~&~~~&~~~\\ \end{bmatrix}}\qquad \text{rank} = n$$

(we are still focusing on this nice well-behaved case)

### a peek into the future

$$m<n:\quad \underset{A}{\begin{bmatrix} ~~~&~~~&~~~&~~~&~~~\\ ~~~&~~~&~~~&~~~&~~~\\ ~~~&~~~&~~~&~~~&~~~\\ \end{bmatrix}}\qquad m>n:\quad \underset{A}{\begin{bmatrix} ~~~&~~~\\ ~~~&~~~\\ ~~~&~~~\\ ~~~&~~~\\ ~~~&~~~\\ ~~~&~~~ \end{bmatrix}}$$

(These two are the more interesting cases that we will come to a bit later in the course)

# Things could still go wrong

### (even in the good case)

$$\underset{A}{\begin{bmatrix} 1&2&-1\\ -1&-2&-3\\ 2&1&2 \end{bmatrix}}\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}=\begin{bmatrix} 0\\ -4\\ 4 \end{bmatrix}\qquad\begin{aligned} x_1 + 2x_2 - x_3 &= 0\\ -x_1 - 2x_2 - 3x_3 &= -4\\ 2x_1 + x_2 + 2x_3 &= 4 \end{aligned}$$

(row2 = row2 + row1): $E_{21}=\begin{bmatrix} 1&0&0\\ 1&1&0\\ 0&0&1 \end{bmatrix}$, (row3 = row3 - 2*row1): $E_{31}=\begin{bmatrix} 1&0&0\\ 0&1&0\\ -2&0&1 \end{bmatrix}$

$$\begin{bmatrix} 1&2&-1\\ 0&0&-4\\ 0&-3&4 \end{bmatrix}\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}=\begin{bmatrix} 0\\ -4\\ 4 \end{bmatrix}$$

The second pivot is now 0, so elimination cannot continue until we exchange rows 2 and 3:

(exchange row 2 and row 3): $P=\begin{bmatrix} 1&0&0\\ 0&0&1\\ 0&1&0 \end{bmatrix}$

$$\begin{bmatrix} 1&2&-1\\ 0&-3&4\\ 0&0&-4 \end{bmatrix}\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}=\begin{bmatrix} 0\\ 4\\ -4 \end{bmatrix}$$

Back substitution:

$$-4x_3 = -4 \therefore x_3 = 1$$
$$-3x_2 + 4(1) = 4 \therefore x_2 = 0$$
$$x_1 + 2(0) - (1) = 0 \therefore x_1 = 1$$
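As a quick sanity check (a sketch assuming `numpy` is available), the hand-computed solution $(x_1, x_2, x_3) = (1, 0, 1)$ can be verified on the original system:

```python
import numpy as np

# The system from the worked example above.
A = np.array([[1., 2., -1.],
              [-1., -2., -3.],
              [2., 1., 2.]])
b = np.array([0., -4., 4.])

# numpy performs its own (pivoted) elimination internally.
x = np.linalg.solve(A, b)
print(x)  # should match (1, 0, 1)
```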

# Things could still go wrong

### (even in the good case)

$$\underset{P}{\begin{bmatrix} 1&0&0\\ 0&0&1\\ 0&1&0 \end{bmatrix}}\underset{E_{31}}{\begin{bmatrix} 1&0&0\\ 0&1&0\\ -2&0&1 \end{bmatrix}}\underset{E_{21}}{\begin{bmatrix} 1&0&0\\ 1&1&0\\ 0&0&1 \end{bmatrix}}\underset{A}{\begin{bmatrix} 1&2&-1\\ -1&-2&-3\\ 2&1&2 \end{bmatrix}}=\underset{U}{\begin{bmatrix} 1&2&-1\\ 0&-3&4\\ 0&0&-4 \end{bmatrix}}$$

$$A = E_{21}^{-1}E_{31}^{-1}P^{-1}U$$

$$E_{21}^{-1}E_{31}^{-1}=\begin{bmatrix} 1&0&0\\ -1&1&0\\ 2&0&1 \end{bmatrix}\qquad E_{21}^{-1}E_{31}^{-1}P^{-1}=\begin{bmatrix} 1&0&0\\ -1&1&0\\ 2&0&1 \end{bmatrix}\begin{bmatrix} 1&0&0\\ 0&0&1\\ 0&1&0 \end{bmatrix}=\begin{bmatrix} 1&0&0\\ -1&0&1\\ 2&1&0 \end{bmatrix}$$

The matrix multiplying $\begin{bmatrix} 1&2&-1\\ 0&-3&4\\ 0&0&-4\\ \end{bmatrix}$ is not lower triangular, so this is not yet an LU factorisation of $A$.

# Things could still go wrong

### (even in the good case)

$$A=\underset{E_{21}^{-1}E_{31}^{-1}}{\begin{bmatrix} 1&0&0\\ -1&1&0\\ 2&0&1 \end{bmatrix}}\underset{P^{-1}}{\begin{bmatrix} 1&0&0\\ 0&0&1\\ 0&1&0 \end{bmatrix}}\underset{U}{\begin{bmatrix} 1&2&-1\\ 0&-3&4\\ 0&0&-4 \end{bmatrix}}$$

Multiplying both sides by $P$:

$$PA=\underset{P}{\begin{bmatrix} 1&0&0\\ 0&0&1\\ 0&1&0 \end{bmatrix}}\begin{bmatrix} 1&0&0\\ -1&1&0\\ 2&0&1 \end{bmatrix}\underset{P^{-1}}{\begin{bmatrix} 1&0&0\\ 0&0&1\\ 0&1&0 \end{bmatrix}}\begin{bmatrix} 1&2&-1\\ 0&-3&4\\ 0&0&-4 \end{bmatrix}=\begin{bmatrix} 1&0&0\\ 2&0&1\\ -1&1&0\\ \end{bmatrix}\begin{bmatrix} 1&0&0\\ 0&0&1\\ 0&1&0 \end{bmatrix}\begin{bmatrix} 1&2&-1\\ 0&-3&4\\ 0&0&-4 \end{bmatrix}$$

$$PA=\underset{L}{\begin{bmatrix} 1&0&0\\ 2&1&0\\ -1&0&1\\ \end{bmatrix}}\underset{U}{\begin{bmatrix} 1&2&-1\\ 0&-3&4\\ 0&0&-4 \end{bmatrix}}$$

$$PA=LU$$
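The factorisation above can be checked numerically; the matrices below are copied from the slides, and `numpy` is assumed:

```python
import numpy as np

# P, L, U from the slides, for the matrix A that needed a row exchange.
A = np.array([[1., 2., -1.],
              [-1., -2., -3.],
              [2., 1., 2.]])
P = np.array([[1., 0., 0.],
              [0., 0., 1.],
              [0., 1., 0.]])
L = np.array([[1., 0., 0.],
              [2., 1., 0.],
              [-1., 0., 1.]])
U = np.array([[1., 2., -1.],
              [0., -3., 4.],
              [0., 0., -4.]])

print(np.allclose(P @ A, L @ U))  # True: PA = LU
```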

# The asymmetry in LU factorisation

$$\underset{A}{\begin{bmatrix} 1&2&-1\\ -1&2&-3\\ 2&1&2 \end{bmatrix}}=\underset{L\text{ (diagonal elements are 1)}}{\begin{bmatrix} 1 & 0 & 0\\ -1 & 1 & 0\\ 2 & -\frac{3}{4} & 1\\ \end{bmatrix}}\underset{U\text{ (diagonal elements are not 1)}}{\begin{bmatrix} 1&2&-1\\ 0&4&-4\\ 0&0&1 \end{bmatrix}}$$

The pivots can be factored out of $U$:

$$\begin{bmatrix} a&d&e\\ 0&b&f\\ 0&0&c \end{bmatrix}=\begin{bmatrix} a&0&0\\ 0&b&0\\ 0&0&c \end{bmatrix}\begin{bmatrix} 1&\frac{d}{a}&\frac{e}{a}\\ 0&1&\frac{f}{b}\\ 0&0&1 \end{bmatrix}$$

# LDU factorisation

$$\underset{A}{\begin{bmatrix} 1&2&-1\\ -1&2&-3\\ 2&1&2 \end{bmatrix}}=\underset{L\text{ (diagonal elements are 1)}}{\begin{bmatrix} 1 & 0 & 0\\ -1 & 1 & 0\\ 2 & -\frac{3}{4} & 1\\ \end{bmatrix}}\underset{D}{\begin{bmatrix} 1&0&0\\ 0&4&0\\ 0&0&1 \end{bmatrix}}\underset{U\text{ (diagonal elements are 1)}}{\begin{bmatrix} 1&2&-1\\ 0&\frac{4}{4}&\frac{-4}{4}\\ 0&0&1 \end{bmatrix}}$$

In general:

$$\begin{bmatrix} a&d&e\\ 0&b&f\\ 0&0&c \end{bmatrix}=\begin{bmatrix} a&0&0\\ 0&b&0\\ 0&0&c \end{bmatrix}\begin{bmatrix} 1&\frac{d}{a}&\frac{e}{a}\\ 0&1&\frac{f}{b}\\ 0&0&1 \end{bmatrix}$$

$$PA=LDU$$
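This LDU factorisation can also be checked numerically (a sketch assuming `numpy`; for this particular $A$ no row exchanges were needed, so $P=I$):

```python
import numpy as np

# The matrices from the LDU slide.
A = np.array([[1., 2., -1.],
              [-1., 2., -3.],
              [2., 1., 2.]])
L = np.array([[1., 0., 0.],
              [-1., 1., 0.],
              [2., -0.75, 1.]])
D = np.diag([1., 4., 1.])           # the pivots
U = np.array([[1., 2., -1.],        # unit upper triangular
              [0., 1., -1.],
              [0., 0., 1.]])

print(np.allclose(A, L @ D @ U))    # True: A = LDU
```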

# What is the cost of Gauss Elimination?

$$\underset{A}{\begin{bmatrix} a_{11}&a_{12}&\dots&a_{1n}\\ a_{21}&a_{22}&\dots&a_{2n}\\ \vdots&\vdots&\ddots&\vdots\\ a_{n1}&a_{n2}&\dots&a_{nn}\\ \end{bmatrix}}\underset{\mathbf{x}}{\begin{bmatrix} x_1\\ x_2\\ \vdots\\ x_n \end{bmatrix}}=\underset{\mathbf{b}}{\begin{bmatrix} 1\\ 2\\ \vdots\\ 2\\ \end{bmatrix}}$$

Cost of elimination on $A$:

$$n(n-1) + (n-1)(n-2) + (n-2)(n-3) + \cdots + 2(1) \approx \frac{1}{3}n^3 = O(n^3)$$

### Associated cost of changing b

$$(n-1) + (n-2) + \cdots + 1 = \frac{1}{2}n(n-1) = O(n^2)$$
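The operation count above can be reproduced directly; `elimination_ops` below is a hypothetical helper that just sums the per-pivot work:

```python
# At pivot k (0-indexed) there are (n-1-k) rows below the pivot, and each
# row update touches (n-k) entries, giving n(n-1) + (n-1)(n-2) + ... + 2(1).
def elimination_ops(n):
    return sum((n - 1 - k) * (n - k) for k in range(n - 1))

for n in (10, 100, 1000):
    print(n, elimination_ops(n), round(n**3 / 3))
```

For large $n$ the exact count and $n^3/3$ agree to within a fraction of a percent.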

# Why do we care about LU factorisation?

$$A = LU$$

$$A\mathbf{x} = \mathbf{b_1},\quad A\mathbf{x} = \mathbf{b_2},\quad A\mathbf{x} = \mathbf{b_3},\quad\cdots\quad A\mathbf{x} = \mathbf{b_k}$$

### irrespective of b, the Gaussian elimination of A does not change

(so we can do it just once, at $O(n^3)$ cost)

### L contains all the information about the elimination steps

(so we can apply these steps to a new b in $O(n^2)$ time to get a c)

### We can then solve

$$U\mathbf{x} = \mathbf{c}$$
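A minimal sketch of this "factor once, reuse for every b" idea, assuming no row exchanges are needed (all pivots non-zero); `lu_no_pivot`, `forward_sub`, and `back_sub` are illustrative helpers, not a library API:

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorisation, assuming every pivot is non-zero."""
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]        # multiplier for row i
            U[i, k:] -= L[i, k] * U[k, k:]     # eliminate entry (i, k)
    return L, U

def forward_sub(L, b):                         # solve Lc = b in O(n^2)
    c = np.zeros(len(b))
    for i in range(len(b)):
        c[i] = b[i] - L[i, :i] @ c[:i]         # L has unit diagonal
    return c

def back_sub(U, c):                            # solve Ux = c in O(n^2)
    n = len(c)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (c[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

A = np.array([[1., 2., -1.], [-1., 2., -3.], [2., 1., 2.]])
L, U = lu_no_pivot(A)                          # one-time O(n^3) work
for b in (np.array([0., -4., 4.]), np.array([1., 2., 3.])):
    x = back_sub(U, forward_sub(L, b))         # O(n^2) per right-hand side
    print(np.allclose(A @ x, b))               # True
```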

# One Lin. System = 2 Triangular Systems

$$A\mathbf{x} = \mathbf{b}\quad\equiv\quad U\mathbf{x} = \mathbf{c}$$

### How do you find c?

(by solving a triangular system)

### Prove that Lc = b

implication: c can be obtained by solving this triangular system

### What is the cost of solving a triangular system?

hint: it had better be less than $O(n^3)$

### What is the net effect?

One-time cost of elimination: ?
Recurring cost of solving Lc = b and Ux = c: ?
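One way to fill in the proof and the cost estimate (a sketch):

```latex
% Proof that Lc = b:
A = LU \implies A\mathbf{x} = L(U\mathbf{x}) = \mathbf{b}.
% Define c as the intermediate vector:
\mathbf{c} := U\mathbf{x} \implies L\mathbf{c} = \mathbf{b}.
% So c comes from forward substitution on Lc = b,
% and x from back substitution on Ux = c.
%
% Cost of one triangular solve: the i-th unknown needs about i operations, so
1 + 2 + \cdots + n = \tfrac{1}{2}n(n+1) = O(n^2)
% per solve: far below the one-time O(n^3) cost of elimination.
```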

# The bigger picture

(recall: we are still focusing on the nice well-behaved case, a square $A$ with $m=n$ and full rank)

### Alternative method (for the good case)

\mathbf{x} = A^{-1}\mathbf{b}

# Is A invertible?

### Proof (the only if part): if an $n\times n$ matrix $A$ is invertible then it has $n$ non-zero pivots

given: $A$ is invertible
to prove: $A$ has $n$ non-zero pivots

(the if part)
given: $A$ has $n$ non-zero pivots
to prove: $A$ is invertible

# How to find the inverse of A

$$AA^{-1} = I$$

$$\underset{A}{\begin{bmatrix} ~~&~~&~~\\ ~~&~~&~~\\ ~~&~~&~~\\ \end{bmatrix}}\underset{A^{-1}}{\begin{bmatrix} \uparrow&\uparrow&\uparrow\\ \mathbf{x_1}&\mathbf{x_2}&\mathbf{x_3}\\ \downarrow&\downarrow&\downarrow\\ \end{bmatrix}}=\begin{bmatrix} 1&0&0\\ 0&1&0\\ 0&0&1\\ \end{bmatrix}$$

$$A\mathbf{x_1}=\begin{bmatrix} 1\\ 0\\ 0\\ \end{bmatrix}\qquad A\mathbf{x_2}=\begin{bmatrix} 0\\ 1\\ 0\\ \end{bmatrix}\qquad A\mathbf{x_3}=\begin{bmatrix} 0\\ 0\\ 1\\ \end{bmatrix}$$
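Each column of $A^{-1}$ can indeed be obtained by solving one system $A\mathbf{x_i} = \mathbf{e_i}$; a sketch with `numpy` on a sample invertible matrix:

```python
import numpy as np

# A sample invertible 3x3 matrix.
A = np.array([[2., 1., 1.],
              [4., -6., 0.],
              [-2., 7., 2.]])
n = A.shape[0]

# Solve A x_i = e_i for each column e_i of the identity.
cols = [np.linalg.solve(A, np.eye(n)[:, i]) for i in range(n)]
A_inv = np.column_stack(cols)

print(np.allclose(A @ A_inv, np.eye(n)))      # True
print(np.allclose(A_inv, np.linalg.inv(A)))   # True
```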

# Gauss Jordan method for finding $$A^{-1}$$

Start from the augmented matrix $[\,A \mid I\,]$:

$$\begin{bmatrix} 2&1&1&1&0&0\\ 4&-6&0&0&1&0\\ -2&7&2&0&0&1 \end{bmatrix}$$

(row 2 = row 2 - 2*row 1, row 3 = row 3 + row 1): apply $E_{31}E_{21}$

$$\begin{bmatrix} 2&1&1&1&0&0\\ 0&-8&-2&-2&1&0\\ 0&8&3&1&0&1 \end{bmatrix}$$

(row 3 = row 3 + row 2): apply $E_{32}$

$$\begin{bmatrix} 2&1&1&1&0&0\\ 0&-8&-2&-2&1&0\\ 0&0&1&-1&1&1 \end{bmatrix}$$

(row 2 = row 2 + 2*row 3, row 1 = row 1 - row 3): apply $E_{13}E_{23}$

$$\begin{bmatrix} 2&1&\mathbf{0}&2&-1&-1\\ 0&-8&\mathbf{0}&-4&3&2\\ 0&0&1&-1&1&1 \end{bmatrix}$$

(row 1 = row 1 + 1/8*row 2): apply $E_{12}$

$$\begin{bmatrix} 2&\mathbf{0}&\mathbf{0}&\frac{12}{8}&-\frac{5}{8}&-\frac{6}{8}\\ 0&-8&\mathbf{0}&-4&3&2\\ 0&0&1&-1&1&1 \end{bmatrix}$$

Finally, divide each row by its pivot, i.e. multiply by $M=\begin{bmatrix} 1/2&0&0\\ 0&-1/8&0\\ 0&0&1 \end{bmatrix}$:

$$\begin{bmatrix} 1&\mathbf{0}&\mathbf{0}&\frac{12}{16}&\frac{-5}{16}&-\frac{6}{16}\\ 0&1&\mathbf{0}&\frac{4}{8}&-\frac{3}{8}&-\frac{2}{8}\\ 0&0&1&-1&1&1 \end{bmatrix}$$

The left block is now $I$, the right block holds the columns $\mathbf{x_1}, \mathbf{x_2}, \mathbf{x_3}$, and the product of all the steps is $ME_{12}E_{13}E_{23}E_{32}E_{31}E_{21}=E$.

(For dividing by pivots, compare: $\begin{bmatrix} 2&1\\ 0&3 \end{bmatrix}=\begin{bmatrix} 2&0\\ 0&3 \end{bmatrix}\begin{bmatrix} 1&\frac{1}{2}\\ 0&1 \end{bmatrix}$.)

# Gauss Jordan method for finding $$A^{-1}$$

Collecting all the elimination steps into a single matrix $E$:

$$E\begin{bmatrix} A&I \end{bmatrix}=\begin{bmatrix} EA&EI \end{bmatrix}=\begin{bmatrix} I&E \end{bmatrix}$$

$$EA=I \;\therefore\; E=A^{-1}$$

In the example:

$$E\underset{[\,A\mid I\,]}{\begin{bmatrix} 2&1&1&1&0&0\\ 4&-6&0&0&1&0\\ -2&7&2&0&0&1 \end{bmatrix}}=\underset{[\,I\mid E\,]}{\begin{bmatrix} 1&\mathbf{0}&\mathbf{0}&\frac{12}{16}&\frac{-5}{16}&-\frac{6}{16}\\ 0&1&\mathbf{0}&\frac{4}{8}&-\frac{3}{8}&-\frac{2}{8}\\ 0&0&1&-1&1&1 \end{bmatrix}}$$

so the right half of the final augmented matrix is $A^{-1}$.
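The whole procedure can be sketched in a few lines; `gauss_jordan_inverse` is an illustrative helper that assumes $A$ is invertible and needs no row exchanges (it scales each pivot row as it goes, a harmless reordering of the steps above):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Reduce [A | I] to [I | A^-1], assuming non-zero pivots throughout."""
    n = A.shape[0]
    aug = np.hstack([A.astype(float), np.eye(n)])
    for k in range(n):
        aug[k] /= aug[k, k]                   # scale pivot row: pivot becomes 1
        for i in range(n):
            if i != k:
                aug[i] -= aug[i, k] * aug[k]  # clear column k above and below
    return aug[:, n:]                         # right block is the inverse

A = np.array([[2., 1., 1.],
              [4., -6., 0.],
              [-2., 7., 2.]])
print(np.allclose(gauss_jordan_inverse(A), np.linalg.inv(A)))  # True
```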

# The two choices

### Compute $$A^{-1}$$, then $$\mathbf{x} = A^{-1}\mathbf{b}$$

Cost: $\approx\frac{1}{3}n^3$
Cost: HW2
Cost: HW2
Cost: HW2
Cost: HW2
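Both choices give the same answer; a sketch comparing them on a random, deliberately well-conditioned system (`numpy` assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Adding n*I makes the random matrix comfortably invertible.
A = rng.standard_normal((n, n)) + n * np.eye(n)
b = rng.standard_normal(n)

x_solve = np.linalg.solve(A, b)    # elimination + two triangular solves
x_inv = np.linalg.inv(A) @ b       # explicit inverse, then a multiply

print(np.allclose(x_solve, x_inv))  # True
```

In practice the `solve` route is generally preferred: forming the explicit inverse costs more and tends to be less accurate.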

# Invertibility and number of solutions

[Diagram: a function $f$ from domain $\mathbb{R}$ to range $\mathbb{R}$, mapping points $x_1, x_2, x_3, \dots, x_j, \dots$ to points $y_1, y_2, y_3, \dots, y_j, \dots$]

# Invertibility and number of solutions

[Diagram: the map $\mathbf{x}\mapsto A\mathbf{x}$ from the domain $\mathbb{R}^n$ to the range $\mathbb{R}^n$, sending vectors $\mathbf{x}_1, \mathbf{x}_2, \mathbf{x}_3, \dots, \mathbf{x}_j, \dots$ to vectors $\mathbf{b}_1, \mathbf{b}_2, \mathbf{b}_3, \dots, \mathbf{b}_k, \dots$]

### some $$\mathbf{b}$$'s are not produced by any $$\mathbf{x}$$

(0 solutions to Ax = b), while other $$\mathbf{b}$$'s are produced by many $$\mathbf{x}$$'s (infinite solutions to Ax = b)

### a unique $$\mathbf{x}$$ for every $$\mathbf{b}$$

(exactly 1 solution to Ax = b for every possible b)
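These three situations can be distinguished numerically by comparing the rank of $A$ with the rank of the augmented matrix $[\,A \mid \mathbf{b}\,]$; a sketch with `numpy`:

```python
import numpy as np

A_inv = np.array([[1., 2.], [3., 4.]])    # invertible: rank 2
A_sing = np.array([[1., 2.], [2., 4.]])   # singular: rank 1

# Invertible A: exactly one x for every b.
b = np.array([1., 1.])
x = np.linalg.solve(A_inv, b)
print(np.allclose(A_inv @ x, b))          # True: the unique solution

# Singular A, b outside the column space: rank grows when b is appended,
# so the system is inconsistent (0 solutions).
b_bad = np.array([1., 0.])
print(np.linalg.matrix_rank(np.column_stack([A_sing, b_bad])))   # 2 > 1

# Singular A, b inside the column space: rank unchanged, so solutions exist,
# and the nullspace makes them infinitely many.
b_good = np.array([1., 2.])
print(np.linalg.matrix_rank(np.column_stack([A_sing, b_good])))  # 1
```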