Day 16:
Inner products and orthogonality
Inner products
Definition. Given a vector space \(V\), a function \(\langle\cdot,\cdot\rangle:V\times V\to\mathbb{R}\) is called an inner product if given any \(u,v,w\in V\) and \(\alpha\in\mathbb{R}\) we have
- \(\langle u,v\rangle = \langle v,u\rangle\)
- \(\langle \alpha u,v\rangle = \alpha \langle u,v\rangle\)
- \(\langle u+v,w\rangle = \langle u,w\rangle + \langle v,w\rangle\)
- \(\langle u,u\rangle\geq 0\)
- If \(\langle u,u\rangle = 0\), then \(u=0\).
If a space \(V\) has an inner product, then \(V\) is called an inner product space.
These properties have names:
- Property 1 is called symmetry of the inner product
- Properties 2 and 3 mean that the inner product is linear in the first argument
- A function satisfying Property 4 is called positive semi-definite.
- A function satisfying Property 5 is called definite.
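The five axioms can be spot-checked on concrete vectors. Below is a minimal sketch (the function names and the sample candidate are assumptions, not part of the notes): a finite check like this can only catch violations, never prove the axioms.

```python
def check_axioms(inner, u, v, w, alpha):
    """Spot-check the inner-product axioms on one sample of vectors."""
    assert inner(u, v) == inner(v, u)                  # 1: symmetry
    au = [alpha * ui for ui in u]
    assert inner(au, v) == alpha * inner(u, v)         # 2: homogeneity
    upv = [a + b for a, b in zip(u, v)]
    assert inner(upv, w) == inner(u, w) + inner(v, w)  # 3: additivity
    assert inner(u, u) >= 0                            # 4: positive semi-definite
    # Axiom 5 (definiteness) quantifies over all u, so it cannot be
    # verified by finitely many samples.

# a candidate inner product on R^2: a positively weighted coordinate sum
candidate = lambda u, v: 2 * u[0] * v[0] + 3 * u[1] * v[1]
check_axioms(candidate, [1, -2], [4, 5], [0, 3], 7)
```

Note that the candidate here is not the usual dot product; weighting each coordinate by a positive constant still satisfies all five axioms.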
Dot Product
Given two vectors
\[u = \begin{bmatrix}u_{1}\\ u_{2}\\ \vdots\\ u_{d}\end{bmatrix}\quad\text{and}\quad v = \begin{bmatrix}v_{1}\\ v_{2}\\ \vdots\\ v_{d}\end{bmatrix}\]
in \(\mathbb{R}^{d}\), their dot product is given by
\[u\cdot v=u^{\top}v = \begin{bmatrix}u_{1} & u_{2} & \cdots & u_{d}\end{bmatrix}\begin{bmatrix}v_{1}\\ v_{2}\\ \vdots\\ v_{d}\end{bmatrix} = \sum_{i=1}^{d}u_{i}v_{i}\]
The dot product is an inner product on \(\mathbb{R}^{d}\), and hence \(\mathbb{R}^{d}\) with the dot product is an inner product space.
Unless explicitly stated otherwise, this is the inner product on \(\mathbb{R}^{d}\).
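The formula \(u\cdot v=\sum_{i=1}^{d}u_{i}v_{i}\) translates directly into code; here is a minimal sketch using plain Python lists (the function name `dot` is an assumption):

```python
def dot(u, v):
    """Return the dot product sum_i u[i] * v[i] of two equal-length vectors."""
    assert len(u) == len(v), "vectors must lie in the same R^d"
    return sum(ui * vi for ui, vi in zip(u, v))

print(dot([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```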
An inner product on \(\mathbb{P}_{n}\).
Given two vectors \(f(x),g(x)\in\mathbb{P}_{n}\) set
\[\langle f(x),g(x)\rangle = \int_{0}^{1}f(x)g(x)\, dx.\]
This function is an inner product on \(\mathbb{P}_{n}\).
This function is clearly symmetric. If \(f(x),g(x),h(x)\in\mathbb{P}_{n}\), and \(a\in\mathbb{R}\), then
\[\langle f(x)+ag(x),h(x)\rangle = \int_{0}^{1}(f(x)+ag(x))h(x)\, dx \]
\[= \int_{0}^{1} f(x)h(x)+ag(x)h(x)\, dx = \int_{0}^{1}f(x)h(x)\, dx + a\int_{0}^{1}g(x)h(x)\,dx\]
\[=\langle f(x),h(x)\rangle + a\langle g(x),h(x)\rangle\]
Hence, this function is linear in the first argument. Finally,
\[\langle f(x),f(x)\rangle = \int_{0}^{1} [f(x)]^{2}\,dx\] Clearly, \(\langle f(x),f(x)\rangle\geq 0\), with equality iff \(f(x)=0\) for all \(x\in[0,1]\). Since a nonzero polynomial of degree at most \(n\) has at most \(n\) roots, this forces \(f(x)\) to be the zero polynomial, so the function is definite.
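This inner product can be computed exactly by encoding a polynomial as its coefficient list and using \(\int_{0}^{1}x^{k}\,dx = \frac{1}{k+1}\). The sketch below assumes the encoding \([c_{0},c_{1},\ldots]\) for \(c_{0}+c_{1}x+\cdots\); the function names are assumptions.

```python
from fractions import Fraction

def poly_mul(f, g):
    """Coefficients of the product polynomial f(x) * g(x)."""
    h = [Fraction(0)] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            h[i + j] += Fraction(a) * Fraction(b)
    return h

def inner(f, g):
    """<f, g> = integral of f*g over [0, 1], via int_0^1 x^k dx = 1/(k+1)."""
    return sum(c / (k + 1) for k, c in enumerate(poly_mul(f, g)))

print(inner([0, 1], [0, 1]))  # <x, x> = 1/3
```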
Other inner products
Example. For \(f(x),g(x)\in\mathbb{P}_{n}\) and any two numbers \(a<b\) we can define the inner product
\[\langle f(x),g(x)\rangle = \int_{a}^{b}f(x)g(x)\, dx.\]
Example. Given vectors \(x,y\in\mathbb{R}^{d}\) and a diagonal matrix \(A\in\mathbb{R}^{d\times d}\) with strictly positive diagonal entries, the function
\[\langle x,y\rangle = x^{\top}Ay\]
is an inner product on \(\mathbb{R}^{d}\).
Example. For \(f(x),g(x)\in\mathbb{P}_{n}\) and any set \(\{x_{1},x_{2},\ldots,x_{n},x_{n+1}\}\subset\mathbb{R}\) with \(n+1\) distinct elements, we can define the inner product
\[\langle f(x),g(x)\rangle = \sum_{i=1}^{n+1}f(x_{i})g(x_{i}).\]
(Definiteness holds because a nonzero polynomial of degree at most \(n\) cannot vanish at \(n+1\) distinct points.)
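Both of these examples are one-line sums in code. The sketch below assumes the names `weighted_inner` and `sample_inner`, with the diagonal of \(A\) passed as a weight list and the polynomials passed as callables:

```python
def weighted_inner(x, y, w):
    """<x, y> = x^T A y with A = diag(w); every weight must be positive."""
    assert all(wi > 0 for wi in w), "diagonal entries must be positive"
    return sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))

def sample_inner(f, g, points):
    """<f, g> = sum_i f(x_i) g(x_i) over n+1 distinct points."""
    assert len(set(points)) == len(points), "points must be distinct"
    return sum(f(x) * g(x) for x in points)

print(weighted_inner([1, 2], [3, 4], [2, 1]))             # 2*1*3 + 1*2*4 = 14
print(sample_inner(lambda x: x, lambda x: x, [0, 1, 2]))  # 0 + 1 + 4 = 5
```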
Orthogonality
Definition. We say that two vectors \(u\) and \(v\) are orthogonal if
\[\langle u,v\rangle=0.\]
Example. Consider the inner product space \(\mathbb{P}_{2}\) with the inner product given above. The vectors \(1\) and \(x-\frac{1}{2}\) are orthogonal, since
\[\langle 1,x-\tfrac{1}{2}\rangle = \int_{0}^{1}\left(x-\frac{1}{2}\right)dx = 0.\]
Example. The vectors \(x=[1\ \ -1]^{\top}\) and \(y=[2\ \ 2]^{\top}\) in \(\mathbb{R}^{2}\) are orthogonal since
\[x\cdot y = [1\ \ -1]\begin{bmatrix}2\\ 2\end{bmatrix} =(1)(2) + (-1)(2) = 0.\]
Orthogonality of subspaces
Definition. Let \(U\) and \(W\) be subspaces of the same inner product space \(V\). We say that \(U\) and \(W\) are orthogonal if
\[\langle u,w\rangle = 0\quad\text{for all }u\in U\text{ and }w\in W.\]
Theorem. Assume \(A\) is an \(m\times n\) matrix. Then \(N(A)\) and \(C(A^{\top})\) are orthogonal subspaces.
Corollary. Assume \(A\) is an \(m\times n\) matrix. Then \(N(A^{\top})\) and \(C(A)\) are orthogonal subspaces.
Proof. Let \(x\in N(A)\) and \(y\in C(A^{\top})\). By the definition of these subspaces, \[Ax=0\quad \text{and}\quad y=A^{\top}z\ \text{for some }z.\]
Hence,
\[y\cdot x = y^{\top}x=(A^{\top}z)^{\top}x=z^{\top}Ax=z^{\top}0=0.\ \Box\]
Proof of the Corollary. Apply the theorem to \(A^{\top}\): then \(N(A^{\top})\) and \(C((A^{\top})^{\top}) = C(A)\) are orthogonal. \(\Box\)
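The theorem can be illustrated on a concrete matrix. The sketch below picks an assumed \(2\times 3\) example \(A\), a vector \(x\) in \(N(A)\), and an arbitrary \(y=A^{\top}z\) in \(C(A^{\top})\), then checks \(y\cdot x=0\) exactly as in the proof:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[1, 2, 3],
     [4, 5, 6]]

x = [1, -2, 1]                       # x is in N(A): each row of A is orthogonal to x
assert all(dot(row, x) == 0 for row in A)

z = [7, -3]                          # any z gives y = A^T z in C(A^T)
y = [sum(A[i][j] * z[i] for i in range(2)) for j in range(3)]

print(dot(y, x))                     # y . x = z^T (A x) = z^T 0 = 0
```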
Theorem. Let \(V\) be an inner product space. If \(S=\{x_{1},x_{2},\ldots,x_{k}\}\subset V\) is a collection of nonzero, pairwise orthogonal vectors, then \(S\) is independent.
Proof. Suppose there are scalars \(a_{1},a_{2},\ldots,a_{k}\) such that
\[\sum_{i=1}^{k}a_{i}x_{i} = 0.\]
For \(j\in\{1,2,\ldots,k\}\) we compute
\[\left\langle \sum_{i=1}^{k}a_{i}x_{i},x_{j}\right\rangle = \sum_{i=1}^{k}a_{i}\langle x_{i},x_{j}\rangle = a_{j}\langle x_{j},x_{j}\rangle.\]
Hence, for each \(j\in\{1,2,\ldots,k\}\) we have
\[a_{j}\langle x_{j},x_{j}\rangle = 0\]
Since \(x_{j}\neq 0\) we conclude that \(\langle x_{j},x_{j}\rangle >0\), and hence \(a_{j} = 0\) for each \(j\in\{1,2,\ldots,k\}\). \(\Box\)
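The key computation in the proof, that pairing \(\sum_{i}a_{i}x_{i}\) with \(x_{j}\) isolates \(a_{j}\langle x_{j},x_{j}\rangle\), can be checked on a concrete orthogonal set. The vectors and coefficients below are assumed examples in \(\mathbb{R}^{3}\) with the dot product:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

vecs = [[1, 1, 0], [1, -1, 0], [0, 0, 2]]  # nonzero and pairwise orthogonal

# confirm pairwise orthogonality
for i in range(len(vecs)):
    for j in range(i + 1, len(vecs)):
        assert dot(vecs[i], vecs[j]) == 0

# form the combination sum_i a_i x_i and pair it with each x_j
a = [2, -3, 5]
combo = [sum(ai * xi[k] for ai, xi in zip(a, vecs)) for k in range(3)]
for j, xj in enumerate(vecs):
    # <sum_i a_i x_i, x_j> collapses to a_j <x_j, x_j>
    assert dot(combo, xj) == a[j] * dot(xj, xj)
```

So if the combination were the zero vector, every pairing would be \(0\), forcing each \(a_{j}=0\), which is exactly the independence argument.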
Norms
Definition. Given an inner product space \(V\) and a vector \(v\in V\), the norm of \(v\) is the number
\[\|v\| = \sqrt{\langle v,v\rangle}.\]
Example. Consider the inner product space \(\mathbb{P}_{2}\) with the inner product given previously. If \(f(x) = x\), then
\[\|f(x)\|= \sqrt{\int_{0}^{1}x^2\, dx} = \sqrt{\frac{1}{3}}.\]
Example. Let \(x=[1\ \ -2]^{\top}\in\mathbb{R}^{2}\), then
\[\|x\| = \sqrt{1^2+(-2)^2} = \sqrt{5}.\]
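Both norm examples above follow the same recipe \(\|v\|=\sqrt{\langle v,v\rangle}\); here is a minimal sketch for the dot-product case (the function name `norm` is an assumption):

```python
import math
from fractions import Fraction

def norm(v):
    """Euclidean norm on R^d: sqrt of the dot product of v with itself."""
    return math.sqrt(sum(vi * vi for vi in v))

print(norm([1, -2]))              # sqrt(5), approximately 2.2360679...

# the P_2 example: ||x|| = sqrt(<x, x>) = sqrt(1/3)
print(math.sqrt(Fraction(1, 3)))  # approximately 0.57735
```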
Linear Algebra Day 16
By John Jasper