# Orthogonal Complement
### Big Idea
The **orthogonal complement** $U^{\perp}$ of a subspace $U$ is the collection of all vectors that are orthogonal to every vector in $U$. Together, $U$ and $U^{\perp}$ split the ambient space into two mutually perpendicular parts: every vector decomposes uniquely into a piece in $U$ plus a piece in $U^{\perp}$.
## Inner Product
:::info
**Definition**
The inner product (or dot product) of two vectors $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$ is a scalar value calculated as:
$$
\langle \mathbf{x} , \mathbf{y} \rangle = \sum_{k=1}^n x_k y_k = \mathbf{x}^T \mathbf{y}
$$
:::
This product has two critical geometric interpretations:
1. **Length/Norm:** The inner product of a vector with itself gives the square of its $L_2$ norm (length): $\langle \mathbf{x}, \mathbf{x} \rangle = \|\mathbf{x}\|^2_2$.
2. **Angle:** It relates to the angle $\theta$ between the vectors: $\langle \mathbf{x}, \mathbf{y} \rangle = \|\mathbf{x}\| \|\mathbf{y}\| \cos\theta$, where $0 \leq \theta \leq \pi$.
:::warning
**Example:**
Let
$$
\mathbf{x} = \begin{bmatrix}1\\-1\\-3\\1\end{bmatrix}, \quad
\mathbf{y} = \begin{bmatrix}2\\1\\1\\0\end{bmatrix}
$$
* $\mathbf{x}\cdot \mathbf{y} = 1\cdot 2 + (-1)\cdot 1 + (-3)\cdot 1+1\cdot 0 = -2$
* $\|\mathbf{x}\|_2 = 2\sqrt{3}$
* $\|\mathbf{y}\|_2 = \sqrt{6}$
* Thus $$\cos\theta = \frac{\langle \mathbf{x}, \mathbf{y} \rangle}{\|\mathbf{x}\|_2\|\mathbf{y}\|_2}
= \frac{-2}{2\sqrt{3}\cdot\sqrt{6}} = \frac{-1}{3\sqrt{2}}$$ so $$\theta = \arccos\!\left(\tfrac{-1}{3\sqrt{2}}\right)$$
:::
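These computations are easy to verify numerically. Here is a minimal sketch (assuming NumPy is available) using the vectors from the example above:

```python
import numpy as np

x = np.array([1, -1, -3, 1], dtype=float)
y = np.array([2, 1, 1, 0], dtype=float)

inner = x @ y                          # <x, y> = x^T y = -2
norm_x = np.linalg.norm(x)             # ||x||_2 = 2*sqrt(3) ≈ 3.4641
norm_y = np.linalg.norm(y)             # ||y||_2 = sqrt(6)   ≈ 2.4495

cos_theta = inner / (norm_x * norm_y)  # -1/(3*sqrt(2)) ≈ -0.2357
theta = np.arccos(cos_theta)           # ≈ 1.8086 radians
print(inner, cos_theta, theta)
```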
:::danger
**Theorem (Cauchy–Schwarz inequality)**
$$
|\langle \mathbf{x}, \mathbf{y} \rangle| \;\leq\; \|\mathbf{x}\| \, \|\mathbf{y}\|
$$
Moreover, equality holds iff $\mathbf{x}$ and $\mathbf{y}$ are linearly dependent, i.e., $\mathbf{x} = c\mathbf{y}$ or $\mathbf{y} = c\mathbf{x}$ for some $c \in \mathbb{R}$.
:::
:::danger
**Theorem (Triangle inequality)**
$$
\|\mathbf{x} + \mathbf{y}\| \;\leq\; \|\mathbf{x}\| + \|\mathbf{y}\|
$$
This follows from Cauchy–Schwarz: $\|\mathbf{x} + \mathbf{y}\|^2 = \|\mathbf{x}\|^2 + 2\langle \mathbf{x}, \mathbf{y} \rangle + \|\mathbf{y}\|^2 \leq \|\mathbf{x}\|^2 + 2\|\mathbf{x}\|\|\mathbf{y}\| + \|\mathbf{y}\|^2 = (\|\mathbf{x}\| + \|\mathbf{y}\|)^2$.
:::
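Both inequalities can be sanity-checked on random vectors; a minimal sketch, assuming NumPy (the `1e-12` tolerance guards against floating-point round-off):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
for _ in range(1000):
    x, y = rng.normal(size=4), rng.normal(size=4)
    # Cauchy–Schwarz: |<x, y>| <= ||x|| ||y||
    assert abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y) + 1e-12
    # Triangle inequality: ||x + y|| <= ||x|| + ||y||
    assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y) + 1e-12
```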
## Orthogonal Vectors and Subspaces
### Orthogonal Vectors
:::info
**Definitions**
Two vectors $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$ are **orthogonal** if their inner product is zero: $\langle \mathbf{x}, \mathbf{y} \rangle = 0$. Geometrically, this means they meet at a $90^\circ$ angle.
* A set of vectors is *orthogonal* if every vector in the set is orthogonal to every other vector.
* The set is *orthonormal* if it is orthogonal and every vector in the set is a unit vector ($\|\mathbf{x}_k\| = 1$).
* **Pythagoras' theorem** generalizes to pairwise orthogonal vectors $\mathbf{x}_1, \ldots, \mathbf{x}_n$: $$\|\mathbf{x}_1 + \mathbf{x}_2 + \cdots + \mathbf{x}_n\|^2= \|\mathbf{x}_1\|^2 + \|\mathbf{x}_2\|^2 + \cdots + \|\mathbf{x}_n\|^2$$
:::
:::warning
**Examples**
1. $\mathbf{x} = \begin{bmatrix}1\\1\\1\end{bmatrix}, \; \mathbf{y} = \begin{bmatrix}-1\\-1\\2\end{bmatrix}$
* $\mathbf{x}$ and $\mathbf{y}$ are orthogonal since $$\langle \mathbf{x}, \mathbf{y} \rangle = 1\cdot (-1) + 1\cdot (-1) + 1\cdot 2 = 0$$
2. $\{\mathbf{e}_1, \ldots, \mathbf{e}_n\}$ = standard basis for $\mathbb{R}^n$
* $\langle \mathbf{e}_i, \mathbf{e}_j \rangle = \delta_{ij}$
* Thus $\{\mathbf{e}_1,\ldots,\mathbf{e}_n\}$ is an orthonormal basis.
:::
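A convenient programmatic test: stack the vectors as columns of a matrix $Q$; the set is orthonormal exactly when $Q^T Q = I$. A minimal sketch with NumPy, reusing the vectors from the examples above:

```python
import numpy as np

x = np.array([1, 1, 1], dtype=float)
y = np.array([-1, -1, 2], dtype=float)
print(x @ y)  # 0.0, so x and y are orthogonal

# Orthonormality check for the standard basis: Q^T Q = I.
Q = np.eye(3)  # columns are e_1, e_2, e_3
print(np.allclose(Q.T @ Q, np.eye(3)))  # True

# Pythagoras for pairwise orthogonal vectors:
vs = [np.array(v, dtype=float) for v in ([1, 0, 0], [0, 2, 0], [0, 0, 3])]
lhs = np.linalg.norm(sum(vs)) ** 2             # ||x_1 + x_2 + x_3||^2 = 14
rhs = sum(np.linalg.norm(v) ** 2 for v in vs)  # 1 + 4 + 9 = 14
print(np.isclose(lhs, rhs))  # True
```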
### Orthogonal Subspaces
:::info
**Definition**
Two subspaces $U_1$ and $U_2$ are **orthogonal**, written $U_1 \perp U_2$, if every vector in $U_1$ is orthogonal to every vector in $U_2$.
:::
* If $U_1 \perp U_2$, then $U_1 \cap U_2 = \{\mathbf{0}\}$ (any common vector $\mathbf{v}$ satisfies $\langle \mathbf{v}, \mathbf{v} \rangle = 0$), so their dimensions add up to the dimension of their sum: $$\dim (U_1 + U_2) = \dim U_1 + \dim U_2$$
* This property imposes constraints: for example, two 2-dimensional planes in $\mathbb{R}^3$ cannot be orthogonal, as $2+2=4$ exceeds the ambient dimension of $3$.
## Orthogonal Complement
:::info
**Definition**
The orthogonal complement of a subspace $U \subseteq \mathbb{R}^n$, denoted $U^\perp$, is the set of all vectors $\mathbf{x} \in \mathbb{R}^n$ that are orthogonal to every vector $\mathbf{y} \in U$:
$$
U^\perp = \{\mathbf{x} \in \mathbb{R}^n : \langle \mathbf{x}, \mathbf{y} \rangle = 0 \text{ for all } \mathbf{y} \in U\}
$$
:::
* $U^\perp$ is a subspace of $\mathbb{R}^n$.
* The entire space $\mathbb{R}^n$ can be expressed as the direct sum of $U$ and $U^\perp$: $\mathbb{R}^n = U \oplus U^\perp$. This means any vector $\mathbf{x}\in \mathbb{R}^n$ can be uniquely decomposed into $\mathbf{x} = \mathbf{x}_U + \mathbf{x}_{U^\perp}$, where $\mathbf{x}_U\in U,\; \mathbf{x}_{U^\perp}\in U^\perp$.
* The dimensions of the complement spaces always add up to the dimension of the ambient space: $\dim U + \dim U^\perp = n$.
* The orthogonal complement of the orthogonal complement is the original subspace: $(U^\perp)^\perp = U$.
:::warning
**Examples**
1. In $\mathbb{R}^3$, $U_1 = \text{Span}\{\mathbf{e}_1, \mathbf{e}_2\}$ and $U_2 = \text{Span}\{\mathbf{e}_3\}$ are orthogonal ($U_1 \perp U_2$).
2. If $U = \text{Span}\{\mathbf{e}_1, \mathbf{e}_3\} \subseteq \mathbb{R}^5$, then $U^\perp = \text{Span}\{\mathbf{e}_2, \mathbf{e}_4, \mathbf{e}_5\}$.
:::
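In code, $U^\perp$ can be computed as the null space of a matrix whose rows span $U$ (this is exactly the identity $\mathcal{N}(A) = \mathcal{R}(A^T)^\perp$ formalized in the next section). A minimal sketch for Example 2, assuming SciPy is available:

```python
import numpy as np
from scipy.linalg import null_space

# Rows span U = Span{e_1, e_3} in R^5.
A = np.array([[1, 0, 0, 0, 0],
              [0, 0, 1, 0, 0]], dtype=float)

W = null_space(A)             # columns form an orthonormal basis of U^perp
print(W.shape[1])             # 3, and indeed dim U + dim U^perp = 2 + 3 = 5
print(np.allclose(A @ W, 0))  # True: each basis vector of U^perp is orthogonal to U
```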
:::success
**Exercise**
1. Let $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3 \in \mathbb{R}^3$ be nonzero vectors. True or False: if $\mathbf{u}_1 \perp \mathbf{u}_2$ and $\mathbf{u}_2 \perp \mathbf{u}_3$, then $\mathbf{u}_1 \perp \mathbf{u}_3$.
2. Let $U, V \subset \mathbb{R}^n$ be subspaces such that $U \perp V$. Determine whether each statement is True or False.
- (a) If $\dim(U)=m$, then $\dim(V) \leq n-m$.
- (b) If $\dim(U)=m$, then $\dim(V) = n-m$.
3. Determine whether the statement is True or False.
- (a) Let $S \subseteq \mathbb{R}^n$ be a subspace. If $\mathbf{u}\in\mathbb{R}^n$ and $\mathbf{u}\neq \mathbf{0}$, then either $\mathbf{u}\in S$ or $\mathbf{u}\in S^\perp$.
- (b) Let $L_1 \subseteq \mathbb{R}^2$ be a line through the origin. There is a unique line $L_2$ through the origin such that $L_1 \perp L_2$.
- \(c) Let $L_1 \subseteq \mathbb{R}^3$ be a line through the origin. There is a unique line $L_2$ through the origin such that $L_1 \perp L_2$.
- (d) Let $U_1 \subseteq \mathbb{R}^4$ be a 2-dimensional subspace. There is a unique plane $U_2$ through the origin such that $U_1 \perp U_2$.
:::
## Fundamental Subspaces
:::info
**Definition**
For an $m \times n$ matrix $A$, there are four **fundamental subspaces**: the nullspace $\mathcal{N}(A)$, the range $\mathcal{R}(A)$ (column space), and the nullspace and range of its transpose, $\mathcal{N}(A^T)$ and $\mathcal{R}(A^T)$ (row space).
:::
:::danger
**Theorem (Orthogonal Decomposition Theorem)**
1. The nullspace of $A$ is the orthogonal complement of the range (column space) of $A^T$ (the row space): $\mathcal{N}(A) = \mathcal{R}(A^T)^{\perp}$.
2. The nullspace of $A^T$ (the left nullspace) is the orthogonal complement of the range of $A$: $\mathcal{N}(A^T) = \mathcal{R}(A)^{\perp}$.
:::
This theorem provides an alternative way to check if an equation $A \mathbf{x}= \mathbf{b}$ has a solution: $\mathbf{b}\in \mathcal{R}(A)$ if and only if $\mathbf{b}$ is orthogonal to every vector in $\mathcal{N}(A^T)$.
:::warning
**Example**
Consider
$$
A = \begin{bmatrix}
1&2&1&1\\
1 & 3 & 0&1 \\
2 & 5 & 1&2
\end{bmatrix}
$$
**Q:** Does there exist $\mathbf{x}$ such that $A\mathbf{x} = \begin{bmatrix}2\\1\\3\end{bmatrix}$?
#### Method 1: Gaussian elimination.
Compute
$$
\text{rref}\!\left( \Big[ A \;\Big|\; \begin{bmatrix}2\\1\\3\end{bmatrix}\Big] \right).
$$
#### Method 2: Check if $\begin{bmatrix}2\\1\\3\end{bmatrix}\in \mathcal{R}(A) = \mathcal{N}(A^T)^\perp$.
* $\text{rref}(A^T) =
\begin{bmatrix}
1 & 0 & 1 \\
0 & 1 & 1 \\
0 & 0 & 0\\
0&0&0
\end{bmatrix} \Rightarrow \mathcal{N}(A^T) = \text{Span}\left\{\begin{bmatrix}-1\\-1\\1\end{bmatrix}\right\}$
* Now check:
$$\begin{bmatrix}2\\1\\3\end{bmatrix} \perp \mathcal{N}(A^T) \;\;\iff\;\;\left\langle \begin{bmatrix}2\\1\\3\end{bmatrix}, \begin{bmatrix}-1\\-1\\1\end{bmatrix}\right\rangle = 0$$ which holds.
*Remark.* If we must test many right-hand sides $\mathbf{b}$, Method 2 is more efficient: $\mathcal{N}(A^T)$ is computed once, and each new $\mathbf{b}$ then costs only one inner product per basis vector.
:::
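Method 2 translates directly into code; a minimal sketch using SymPy's exact arithmetic (`Matrix.nullspace` returns a basis of the nullspace as column vectors):

```python
from sympy import Matrix

A = Matrix([[1, 2, 1, 1],
            [1, 3, 0, 1],
            [2, 5, 1, 2]])
b = Matrix([2, 1, 3])

left_null = A.T.nullspace()  # basis of N(A^T); here the single vector [-1, -1, 1]^T

# b is in R(A) iff b is orthogonal to every basis vector of N(A^T).
solvable = all(b.dot(v) == 0 for v in left_null)
print(solvable)  # True, so A x = b has a solution
```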
:::success
**Exercise**
4. Let $A = LU$ be the LU decomposition of $A$. Determine whether the statement is **True** or **False**.
- (a) $\mathcal{N}(A) = \mathcal{N}(U)$
- (b) $\mathcal{N}(A^T) = \mathcal{N}(U^T)$
- \(c) $\mathcal{R}(A) = \mathcal{R}(U)$
- (d) $\mathcal{R}(A^T) = \mathcal{R}(U^T)$
- (e) $\dim(\mathcal{R}(A)) = \dim(\mathcal{R}(U))$
5. Let $A$ be an $m \times n$ matrix and let $\{\mathbf{u}_1, \mathbf{u}_2\} \subset \mathbb{R}^n$ be a basis of the nullspace $\mathcal{N}(A)$.
- (a) Determine $\dim(\mathcal{R}(A^T))$
- (b) Determine $\dim(\mathcal{N}(A^T))$
6. Let $A$ be a $4 \times 4$ matrix such that $$A = LU = \begin{bmatrix}1 & 0 & 0 & 0 \\ 1 & 1 & 0 & 0 \\0 & 1 & 1 & 0 \\0 & 2 & 1 & 1\end{bmatrix} \begin{bmatrix}1 & -1 & 2 & -1 \\0 & 1 & -3 & 4 \\0 & 0 & 0 & 1 \\0 & 0 & 0 & 0\end{bmatrix}$$
- (a) Find a basis of $\mathcal{N}(A^T)$.
- (b) Find a basis of $\mathcal{R}(A^T)$.
7. Let $A$ be a matrix such that its LU decomposition is of the form $$A = LU =\begin{bmatrix}1 & 0 & 0 \\ * & 1 & 0 \\ * & * & 1\end{bmatrix}\begin{bmatrix}* & * & * & * \\0 & * & * & * \\0 & 0 & 0 & *\end{bmatrix}$$ where $*$ denotes a nonzero number.
- (a) Determine the dimension of $\mathcal{R}(A^T)$.
- (b) Determine the dimension of $\mathcal{N}(A^T)$.
8. Let $$U = \begin{bmatrix}1 & 0 & 0 & -\tfrac{1}{2} \\0 & 1 & 0 & \tfrac{5}{4} \\0 & 0 & 1 & \tfrac{3}{4} \\0 & 0 & 0 & 0\end{bmatrix}$$ be the reduced row echelon form of a matrix $A$.
- (a) What is the rank of $A$?
- (b) What is the dimension of $\mathcal{N}(A^T)$ ?
- \(c) Is the vector $\begin{bmatrix}1\\2\\4\\5\end{bmatrix}$ in the range of $A^T$?
9. Suppose that $A$ is a matrix with
$$\mathcal{R}(A) = \text{span}\left\{
\begin{bmatrix}1\\2\\1\\0\end{bmatrix},
\begin{bmatrix}1\\0\\2\\1\end{bmatrix}
\right\}, \quad \mathcal{R}(A^T) = \text{span}\left\{\begin{bmatrix}1\\1\\1\\1\end{bmatrix},
\begin{bmatrix}1\\1\\-1\\-1\end{bmatrix}
\right\}$$
- (a) How many rows and how many columns does $A$ have?
- (b) What are $\text{rank}(A)$, $\dim(\mathcal{N}(A))$, and $\dim(\mathcal{N}(A^T))$?
- \(c) For what numbers $a$ and $b$ are the vectors $$\begin{bmatrix}2\\a\\-1\\0\end{bmatrix},
\quad \begin{bmatrix}1\\0\\-1\\b\end{bmatrix}$$ contained in $\mathcal{N}(A^T)$?
- (d) Do the vectors you found in the previous part form a basis for $\mathcal{N}(A^T)$? Give a reason.
- (e) Does the equation $A\mathbf{x}=[1,1,1,1]^T$ have a solution? Give a reason. What about the equation $A^T\mathbf{x}=[1,1,1,1]^T$?
:::