---
title: Ch5-5
tags: Linear algebra
GA: G-77TT93X4N1
---
# Chapter 5 extra note 5
## Selected lecture notes
> Rank–Nullity Theorem
> Four fundamental subspaces
> least squares solutions
> solvability condition
### Rank–nullity theorem
**Remarks:**
It should be clear that, given a linear transformation $T:V\to W$ between finite-dimensional inner product spaces,
$$
\text{Ker}(T) \oplus \text{Ker}(T)^\perp = V, \quad \text{Ran}(T)\oplus \text{Ran}(T)^\perp = W.
$$
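As a concrete sketch (my addition, not part of the notes), the decomposition $\text{Ker}(T)\oplus\text{Ker}(T)^\perp = V$ can be checked numerically for a matrix map ${\bf v}\mapsto A{\bf v}$; the matrix $A$ and the vector ${\bf v}$ below are made up for illustration:

```python
import numpy as np

# Hypothetical example: T(v) = A v with a rank-1 matrix, so Ker(A) is 2-D.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# Orthonormal bases from the SVD: rows of Vt beyond the rank span Ker(A),
# the first `rank` rows span Ker(A)^perp.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
N = Vt[rank:].T          # columns span Ker(A)
R = Vt[:rank].T          # columns span Ker(A)^perp

v = np.array([1.0, -1.0, 2.0])
v_ker = N @ (N.T @ v)    # component of v in Ker(A)
v_perp = R @ (R.T @ v)   # component of v in Ker(A)^perp

print(np.allclose(v, v_ker + v_perp))  # True: the two parts recover v
print(np.allclose(A @ v_ker, 0))       # True: kernel part maps to 0
```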
**Proposition:**
Let $V = W_1\oplus W_2$, where $W_1$ and $W_2$ are finite-dimensional subspaces. Then
$$
\text{dim}(V) = \text{dim}(W_1)+\text{dim}(W_2).
$$
* Proof:
> (Hint of the proof)
> Let $\{{\bf u}_1, \cdots, {\bf u}_m\}$ be a basis of $W_1$ and $\{{\bf v}_1, \cdots, {\bf v}_n\}$ be a basis of $W_2$.
> Then show that $\{{\bf u}_1, \cdots, {\bf u}_m, {\bf v}_1, \cdots, {\bf v}_n\}$ is a basis of $V$.
**Proposition:**
Let $T:V\to W$ be linear. Then
$$
\text{dim}(V) = \text{dim}(\text{Ker}(T))+\text{dim}(\text{Ker}(T)^\perp),
$$
and
$$
\text{dim}(W) = \text{dim}(\text{Ran}(T))+\text{dim}(\text{Ran}(T)^\perp).
$$
### The 'essential' part of a linear transformation
Let us define $\tilde{T}$ as the restriction of $T$ to the domain $\text{Ker}(T)^\perp$, with its codomain restricted to $\text{Ran}(T)$:
$$
\tilde{T}:\text{Ker}(T)^\perp\subseteq V\to \text{Ran}(T)\subseteq W.
$$
**Lemma:**
$\tilde{T}$ is an isomorphism.
* Proof:
> (linearity) Trivial.
>
> (1-1) claim: $\text{Ker}(\tilde{T}) = \{{\bf 0}\}$.
> > Since $\text{Ker}(\tilde{T})$ is a vector subspace, ${\bf 0}\in \text{Ker}(\tilde{T})$.
> > So $\{{\bf 0}\}\subseteq \text{Ker}(\tilde{T})$.
> >
> > Suppose ${\bf u}\in \text{Ker}(\tilde{T})$, then we must have ${\bf u}\in\text{Ker}(T)^\perp$ and $\tilde{T}({\bf u})={\bf 0}$.
> > So, $T({\bf u})={\bf 0}$ and hence ${\bf u}\in \text{Ker}(T)$.
> > Therefore, ${\bf u}\in (\text{Ker}(T)^\perp \cap \text{Ker}(T))=\{{\bf 0}\}$.
> > That gives $\text{Ker}(\tilde{T})\subseteq \{{\bf 0}\}$.
>
> (onto)
> claim: Given ${\bf w}\in \text{Ran}(T)$, there exists a ${\bf u}\in\text{Ker}(T)^\perp$ such that $\tilde{T}({\bf u})={\bf w}$.
> > Given ${\bf w}\in \text{Ran}(T)$, there exists ${\bf v}\in V$ such that $T({\bf v})={\bf w}$.
> > Since ${\bf v}\in V$, there exist ${\bf u}\in\text{Ker}(T)^\perp$ and ${\bf v}_k\in\text{Ker}(T)$ such that ${\bf v} = {\bf u} + {\bf v}_k$.
> > Since $T({\bf v}_k)={\bf 0}$, we then have
> > $$
> > {\bf w} = T({\bf v}) = T({\bf u} + {\bf v}_k) = T({\bf u}) + T({\bf v}_k) = T({\bf u}) = \tilde{T}({\bf u}).
> > $$
**Corollary:**
1. $\text{Ker}(T)^\perp$ and $\text{Ran}(T)$ are isomorphic.
2. $\text{dim}(\text{Ker}(T)^\perp) = \text{dim}(\text{Ran}(T))$.
**Rank–Nullity Theorem:**
Let $T:V\to W$ be linear. Then
$$
\text{dim}(V) = \text{dim}(\text{Ker}(T)) + \text{dim}(\text{Ran}(T)).
$$
* Proof:
> $$
> \begin{align}
> \text{dim}(V) &= \text{dim}(\text{Ker}(T)) + \text{dim}(\text{Ker}(T)^\perp) \\
> &= \text{dim}(\text{Ker}(T)) + \text{dim}(\text{Ran}(T)).
> \end{align}
> $$
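The theorem can be sanity-checked numerically for a matrix map $T({\bf v})=A{\bf v}$, where $\dim(V)$ is the number of columns of $A$; the matrix below is a made-up example, not from the notes:

```python
import numpy as np

# Made-up 3x4 matrix whose third row is the sum of the first two (rank 2).
A = np.array([[1.0, 0.0, 2.0, -1.0],
              [0.0, 1.0, 1.0,  1.0],
              [1.0, 1.0, 3.0,  0.0]])

rank = np.linalg.matrix_rank(A)                 # dim Ran(T)
_, s, _ = np.linalg.svd(A)
nullity = A.shape[1] - int(np.sum(s > 1e-10))   # dim Ker(T)

print(rank, nullity, A.shape[1])   # rank + nullity = dim V
```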
### Four fundamental subspaces
**Theorem:**
1. $\text{Ker}(T^*) = \text{Ran}(T)^\perp$.
* Proof:
> Let ${\bf x}\in \text{Ran}(T)^\perp$.
> $\Leftrightarrow$ $\langle{\bf x}, {\bf w}\rangle = 0, \quad \forall {\bf w}\in \text{Ran}(T)$.
> $\Leftrightarrow$ $\langle{\bf x}, T({\bf v})\rangle = 0, \quad \forall {\bf v}\in V$.
> $\Leftrightarrow$ $\langle T^*({\bf x}), {\bf v}\rangle = 0, \quad \forall {\bf v}\in V$.
> $\Leftrightarrow$ $T^*({\bf x})={\bf 0}$.
> $\Leftrightarrow$ ${\bf x}\in \text{Ker}(T^*)$.
2. $\text{Ker}(T) = \text{Ran}(T^*)^\perp$.
3. $\text{Ran}(T) = \text{Ker}(T^*)^\perp$.
4. $\text{Ran}(T^*) = \text{Ker}(T)^\perp$.
**Corollary:**
Let $T:V\to W$ be linear. Then
$$
\text{Ker}(T) \oplus \text{Ran}(T^*) = V, \quad \text{Ran}(T)\oplus \text{Ker}(T^*) = W.
$$
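A quick numerical sketch (my addition, with a made-up matrix) of the identity $\text{Ker}(T^*) = \text{Ran}(T)^\perp$, taking $T({\bf v})=A{\bf v}$ so that $T^*$ corresponds to $A^T$:

```python
import numpy as np

# Hypothetical rank-2 example mapping into W = R^3, so Ker(A^T) is 1-D.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 3.0]])

# Basis of Ker(A^T) from the SVD of A^T.
_, s, Vt = np.linalg.svd(A.T)
rank = int(np.sum(s > 1e-10))
K = Vt[rank:].T              # columns span Ker(A^T)

# The columns of A span Ran(A); each must be orthogonal to Ker(A^T).
print(np.allclose(K.T @ A, 0))   # True
```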
### Least squares solution
Let $T:V\to W$ be a linear transformation. We consider solving the equation $T({\bf v})={\bf w}$, where ${\bf w}\in W$ is a given vector.
* If ${\bf w}\in \text{Ran}(T)$, a solution exists.
* If ${\bf w}\in \text{Ran}(T)$ and $\text{Ker}(T)=\{{\bf 0}\}$, the solution exists and is unique.
* If ${\bf w}\notin \text{Ran}(T)$, no solution exists.
* We can define the least squares solution as
$$
{\bf v}^* = \text{arg}\min_{{\bf v}\in V}\|T({\bf v})-{\bf w}\|^2_2.
$$
* The minimum is achieved when ${\bf w}$ is projected onto $\text{Ran}(T)$, and hence the least squares solution satisfies
$$
T({\bf v}^*)=P_{\text{Ran}(T)}{\bf w}.
$$
* Equivalently, the residual ${\bf w}-T({\bf v}^*)$ is orthogonal to $\text{Ran}(T)$:
$$
\left({\bf w}-T({\bf v}^*)\right) \perp \text{Ran}(T).
$$
In other words,
$$
\langle {\bf w}-T({\bf v}^*), T({\bf v})\rangle = 0, \quad \forall {\bf v}\in V.
$$
We can also rewrite it as
$$
\langle T({\bf v}^*), T({\bf v})\rangle = \langle {\bf w}, T({\bf v})\rangle, \quad \forall {\bf v}\in V.
$$
Using the adjoint transformation, we obtain
$$
\langle T^*T({\bf v}^*), {\bf v}\rangle = \langle T^*({\bf w}), {\bf v}\rangle, \quad \forall {\bf v}\in V.
$$
That is,
$$
T^*T({\bf v}^*) = T^*({\bf w}),
$$
which is often called the **normal equation** for the least squares solution.
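As a sketch (my own, with made-up data): for $T({\bf v})=A{\bf v}$ the normal equation becomes $A^TA\,{\bf v}^* = A^T{\bf w}$, which can be compared against NumPy's least-squares solver:

```python
import numpy as np

# Made-up overdetermined system: fit a line through three points.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])       # full column rank, so A^T A is invertible
w = np.array([1.0, 2.0, 2.0])    # w is not in Ran(A)

v_star = np.linalg.solve(A.T @ A, A.T @ w)    # normal equation
v_ref, *_ = np.linalg.lstsq(A, w, rcond=None)

print(np.allclose(v_star, v_ref))              # True: same minimizer
print(np.allclose(A.T @ (w - A @ v_star), 0))  # True: residual ⊥ Ran(A)
```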
#### Solvability condition
Notice that $\text{Ran}(T)^\perp=\text{Ker}(T^*)$, so ${\bf w}\in \text{Ran}(T)$ if and only if ${\bf w}\perp\text{Ker}(T^*)$. We then obtain the ***solvability condition*** as
$$
\langle{\bf w}, {\bf u}\rangle = 0, \quad \forall{\bf u}\in\text{Ker}(T^*).
$$
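A numerical sketch of the condition (my addition; the matrix and vectors are fabricated): a right-hand side solves $A{\bf v}={\bf w}$ exactly only if it is orthogonal to every basis vector of $\text{Ker}(A^T)$:

```python
import numpy as np

# Made-up rank-2 map into R^3, so Ker(A^T) is 1-dimensional.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 1.0]])

_, s, Vt = np.linalg.svd(A.T)
K = Vt[int(np.sum(s > 1e-10)):].T    # basis of Ker(A^T)

w_good = A @ np.array([1.0, -1.0])   # in Ran(A) by construction
w_bad = w_good + K[:, 0]             # perturbed out of Ran(A)

print(np.allclose(K.T @ w_good, 0))  # True: solvable
print(np.allclose(K.T @ w_bad, 0))   # False: condition violated
```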