---
title: Ch1-6
tags: Linear algebra
GA: G-77TT93X4N1
---
# Chapter 1 extra note 6
> Invertible transformation
> one-to-one
> onto
## Selected lecture notes:
### Invertible transformation
:::info
**Definition:**
Identity transformation: $I_v:V\to V$, $I_v({\bf v})={\bf v}$, $\forall {\bf v}\in V$.
:::
**Proposition:**
For any linear transformation $T:V\to W$, we have $T\circ I_v = T$ and $I_w\circ T = T$.
* Proof:
> Given ${\bf v}\in V$,
$$
(T\circ I_v)({\bf v}) = T(I_v({\bf v})) = T({\bf v}).
$$
> Similarly,
$$
(I_w \circ T)({\bf v}) = I_w(T({\bf v})) = T({\bf v}).
$$
:::info
**Definition:**
Let $T:V\to W$ be a linear transformation. $T$ is ***left invertible*** if $\exists B:W \to V$ such that $B\circ T = I_v$.
:::
:::info
**Definition:**
Let $T:V\to W$ be a linear transformation. $T$ is ***right invertible*** if $\exists C:W \to V$ such that $T\circ C = I_w$.
:::
:::info
**Definition:**
Let $T:V\to W$ be a linear transformation. $T$ is ***invertible*** if it is both left and right invertible.
:::
**Theorem:**
Let $T:V\to W$ be a linear transformation. $T$ is ***invertible*** if and only if there exists a unique left inverse $B$ and a unique right inverse $C$. Moreover, $B=C$.
* Proof:
> ($\Leftarrow$) True by definition.
>
> ($\Rightarrow$)
> Let $T:V\to W$ be an invertible linear transformation, then $\exists B$ and $C$ such that $B\circ T = I_v$ and $T\circ C = I_w$.
> Given ${\bf w}\in W$,
> $$
> B({\bf w}) = B(T(C({\bf w}))) = (B\circ T)(C({\bf w})) = C({\bf w}).
> $$
> Therefore $B=C$.
>
> Suppose $\exists B_2:W\to V$ such that $B_2\circ T = I_v$. Then, given ${\bf w}\in W$,
> $$
> B_2({\bf w}) = B_2(T(C({\bf w}))) = (B_2\circ T)(C({\bf w})) = C({\bf w}).
> $$
> Therefore $B_2=C=B$. So the left inverse is unique.
>
> Similarly, the right inverse is unique.
**Corollary:**
$T:V\to W$ is an invertible linear transformation if and only if $\exists T^{-1}:W\to V$ such that $T^{-1}\circ T = I_v$ and $T\circ T^{-1}=I_w$.
**Corollary:**
Let $A:V\to W$ and $B:U\to V$ be invertible linear transformations, then $A\circ B:U\to W$ is an invertible linear transformation and $(A\circ B)^{-1}=B^{-1}\circ A^{-1}$.
* Proof:
> $$
> (A\circ B)\circ (B^{-1}\circ A^{-1}) = A\circ(B\circ B^{-1})\circ A^{-1} = A\circ I_v\circ A^{-1}=I_w.
> $$
> and
> $$
> (B^{-1}\circ A^{-1})\circ (A\circ B) = B^{-1}\circ(A^{-1}\circ A)\circ B = B^{-1}\circ I_v\circ B=I_u.
> $$
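The corollary can be sanity-checked numerically. A minimal Python sketch, using hand-picked invertible $2\times 2$ matrices (my own illustrative choice, not from the lecture) whose inverses were computed by hand:

```python
from fractions import Fraction

def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Hand-picked invertible matrices with hand-computed inverses.
A     = [[Fraction(1), Fraction(1)], [Fraction(0), Fraction(1)]]
A_inv = [[Fraction(1), Fraction(-1)], [Fraction(0), Fraction(1)]]
B     = [[Fraction(2), Fraction(0)], [Fraction(0), Fraction(1)]]
B_inv = [[Fraction(1, 2), Fraction(0)], [Fraction(0), Fraction(1)]]

I2 = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]
AB = matmul(A, B)

# (A∘B)^{-1} = B^{-1}∘A^{-1}: composing in either order gives the identity.
print(matmul(AB, matmul(B_inv, A_inv)) == I2)  # True
print(matmul(matmul(B_inv, A_inv), AB) == I2)  # True
```

Note the reversal of order: $A^{-1}\circ B^{-1}$ would in general fail, since the last map applied must be undone first.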
---
### Invertibility and solvability
**Theorem:**
Let $T:V\to W$ be a linear transformation. $T$ is invertible if and only if, $\forall b\in W$, the equation $T(x)=b$ has a unique solution.
* Proof:
> ($\Rightarrow$)
> $T$ is invertible, so $\exists T^{-1}:W\to V$ that is linear.
> Given $b\in W$, define $x=T^{-1}(b)$, then
> $$
> T(x) = T(T^{-1}(b))=(T\circ T^{-1})(b) = b.
> $$
>
> To check uniqueness, suppose $\exists x_2\in V$ such that $T(x_2)=b$, then
> $$
> x_2 = (T^{-1}\circ T)(x_2) = T^{-1}(T(x_2)) = T^{-1}(b) = x.
> $$
> Therefore, the solution is unique.
>
> ($\Leftarrow$)
> We define a function $B:W\to V$ such that
> $$
> B(b)=x, \quad \text{if}\,\, T(x)=b.
> $$
> This function $B$ is well-defined since for any $b\in W$, $\exists ! x\in V$ such that $T(x)=b.$
>
> claim 1: $B\circ T = I_v$
> > pf:
> > Let $x\in V$ and $T(x)=b\in W$.
> > Since $T(x)=b$ has only one solution, therefore $B(b)=x$.
> > Then we have
> > $$
> > B(T(x)) = B(b) = x.
> > $$
>
> claim 2: $T\circ B = I_w$
> > pf:
> > Let $b\in W$, $\exists ! x\in V$ such that $T(x)=b$, hence $B(b)=x$, and
> > $$
> > T(B(b)) = T(x) = b.
> > $$
>
> claim 3: $B$ is linear
> > pf:
> > Given $b_1, b_2\in W$, $\exists ! x_1, x_2$ such that $T(x_1)=b_1$ and $T(x_2)=b_2$.
> > Since $T$ is linear and $T\circ B = I_w$, we have
> > $$
> > T(\alpha_1 B(b_1) + \alpha_2 B(b_2)) = \alpha_1 T(B(b_1)) + \alpha_2T(B(b_2)) = \alpha_1 b_1 + \alpha_2 b_2.
> > $$
> > Since the equation $T(x) = \alpha_1 b_1 + \alpha_2 b_2$ has a unique solution, we conclude $B(\alpha_1 b_1 + \alpha_2 b_2) = \alpha_1 B(b_1) + \alpha_2 B(b_2)$.
### One-to-one and onto
:::info
**Definition:**
$T:V\to W$ is one-to-one (1-1, injective) if $T({\bf u})=T({\bf v})$ implies ${\bf u}={\bf v}$.
:::
**Proposition:**
$T:V\to W$ is one-to-one if and only if $\text{Ker}(T)=\{{\bf 0}\}$.
* Proof:
> ($\Rightarrow$)
> $T$ is linear so $T({\bf 0})={\bf 0}$.
> Suppose $\exists {\bf v}\in V$ such that $T({\bf v})={\bf 0}$,
> since $T$ is one-to-one, we must have ${\bf v} = {\bf 0}$.
>
> ($\Leftarrow$)
> Let ${\bf u, v}\in V$ such that $T({\bf u}) = T({\bf v})$.
> Since $T$ is linear
> $$
> {\bf 0} = T({\bf u}) - T({\bf v}) = T({\bf u} - {\bf v}).
> $$
> So $({\bf u} - {\bf v})\in\text{Ker}(T)$ and we must have ${\bf u} - {\bf v}={\bf 0}$, that is, ${\bf u} = {\bf v}$.
**Proposition:**
If $T:V\to W$ is left invertible, then $T$ is one-to-one.
* Proof:
> Given ${\bf u, v}\in V$ such that $T({\bf u})=T({\bf v})$.
> $T$ is left invertible, so $\exists B:W\to V$ such that $B\circ T=I_v$.
> Then
> $$
> {\bf u} = B(T({\bf u})) = B(T({\bf v})) = {\bf v}.
> $$
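A left-invertible map need not be invertible. As an illustration (my example, not from the notes), take the inclusion $T:\mathbb{P}_1\to\mathbb{P}_2$, $T(p)=p$, and let $B:\mathbb{P}_2\to\mathbb{P}_1$ drop the $x^2$ term; then $B\circ T = I_v$ but $T\circ B \neq I_w$. In coordinates w.r.t. $\{1, x\}$ and $\{1, x, x^2\}$:

```python
def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

Tm = [[1, 0], [0, 1], [0, 0]]   # inclusion P1 -> P2 (pad with a zero x^2 coefficient)
Bm = [[1, 0, 0], [0, 1, 0]]     # truncation P2 -> P1 (drop the x^2 term)

I2 = [[1, 0], [0, 1]]
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

print(matmul(Bm, Tm) == I2)  # True:  B∘T is the identity on P1
print(matmul(Tm, Bm) == I3)  # False: T∘B zeroes the x^2 term, so T is not invertible
```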
**Proposition:**
If $T:V\to W$ is not one-to-one, $T$ is not left invertible.
:::info
**Definition:**
$T:V\to W$ is onto (surjective) if for any ${\bf w}\in W$, there exists a ${\bf v}\in V$ such that $T({\bf v})={\bf w}$.
:::
**Proposition:**
If $T:V\to W$ is right invertible, then $T$ is onto.
* Proof:
> Since $T$ is right invertible, $\exists C:W\to V$ such that $T\circ C=I_w$.
> Given ${\bf w}\in W$, we define ${\bf v} = C({\bf w})$.
> Therefore $T({\bf v}) = T(C({\bf w})) = {\bf w}$.
**Proposition:**
If $T:V\to W$ is not onto, $T$ is not right invertible.
**Theorem:**
Let $T:V\to W$ be a linear transformation. $T$ is invertible if and only if $T$ is one-to-one and onto.
* Proof:
> ($\Rightarrow$)
> Given $T\in\mathcal{L}(V, W)$ that is invertible, then $\exists T^{-1}\in\mathcal{L}(W, V)$.
>
> claim: $T$ is one-to-one
> > pf:
> > Suppose there exists ${\bf v}\in V$ such that $T({\bf v})={\bf 0}$.
> > Since $T^{-1}$ is linear and maps ${\bf 0}$ to ${\bf 0}$, we have
> > $$
> > {\bf v} = T^{-1}(T({\bf v})) = T^{-1}({\bf 0}) = {\bf 0}.
> > $$
> > Therefore, $\text{Ker}(T) = \{{\bf 0}\}$ and $T$ is one-to-one.
>
> claim: $T$ is onto
> > pf:
> > Given ${\bf w}\in W$, we have ${\bf w} = T(T^{-1}({\bf w}))$, that is, ${\bf w}\in \text{Ran}(T)$.
> > It is then clear that $\text{Ran}(T) = W$ and $T$ is onto.
>
> ($\Leftarrow$)
> $T$ is onto, so, for each ${\bf w}\in W$, $\exists {\bf v}\in V$ such that $T({\bf v}) = {\bf w}$.
> We then define a function $S:W\to V$ by setting $S({\bf w})={\bf v}$.
>
> claim: $S$ is a right inverse of $T$
> > pf:
> > Given ${\bf w}\in W$, let $S({\bf w}) = {\bf u}$, then $T({\bf u})={\bf w}$ and
> > $$
> > (T\circ S)({\bf w}) = T(S({\bf w})) = T({\bf u}) = {\bf w}.
> > $$
>
> claim: $S$ is a left inverse of $T$
> > pf:
> > Given ${\bf v}\in V$, let $T({\bf v})={\bf w}$, and assume $S({\bf w})={\bf u}$, i.e., $T({\bf u})={\bf w}$.
> > Since $T$ is one-to-one, we must have ${\bf v}={\bf u}$ and therefore
> > $$
> > (S\circ T)({\bf v}) = S(T({\bf v})) = S({\bf w}) = {\bf u}= {\bf v}.
> > $$
>
> claim: $S$ is a linear transformation
> > pf:
> > Given ${\bf w}_1, {\bf w}_2\in W$ and $\alpha_1, \alpha_2\in F$, since $T$ is linear and $T\circ S = I_w$, we have
> > $$
> > T(\alpha_1 S({\bf w}_1) + \alpha_2 S({\bf w}_2)) = \alpha_1T(S({\bf w}_1)) + \alpha_2T(S({\bf w}_2)) =\alpha_1{\bf w}_1 + \alpha_2{\bf w}_2.
> > $$
> > Since $T$ is one-to-one, we must have
> > $$
> > S(\alpha_1{\bf w}_1 + \alpha_2{\bf w}_2) = \alpha_1 S({\bf w}_1) + \alpha_2 S({\bf w}_2).
> > $$
> > So $S$ is a linear transformation.
---
### Examples
#### Example 1
Let $T\in\mathcal{L}(\mathbb{P}_2, \mathbb{P}_2)$ with $T(p) = p'+p$.
Choose $\beta = \{1, x, x^2\}$ as a basis for $\mathbb{P}_2$. We have
$$
\begin{aligned}
T(1) = 1 &=
\begin{bmatrix} 1 & x & x^2
\end{bmatrix}\begin{bmatrix} 1 \\ 0 \\ 0
\end{bmatrix},\\
T(x) = 1+x &=
\begin{bmatrix} 1 & x & x^2
\end{bmatrix}\begin{bmatrix} 1 \\ 1 \\ 0
\end{bmatrix},\\
T(x^2) = 2x + x^2 &=
\begin{bmatrix} 1 & x & x^2
\end{bmatrix}\begin{bmatrix} 0 \\ 2 \\ 1
\end{bmatrix}.
\end{aligned}
$$
So, the matrix representation of $T$ is
$$
[T]_{\beta} = \begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 2 \\ 0 & 0 & 1
\end{bmatrix}.
$$
It is then easy to find that
$$
[T^{-1}]_{\beta} = \begin{bmatrix} 1 & -1 & 2 \\ 0 & 1 & -2 \\ 0 & 0 & 1
\end{bmatrix}.
$$
Therefore,
$$
\begin{aligned}
T^{-1}(a + bx + cx^2) &= T^{-1}\left(\begin{bmatrix} 1 & x & x^2
\end{bmatrix}\begin{bmatrix} a \\ b \\ c
\end{bmatrix}\right)\\
&= \begin{bmatrix} 1 & x & x^2
\end{bmatrix}\begin{bmatrix} a-b+2c \\ b-2c \\ c
\end{bmatrix} \\
&= (a-b+2c) + (b-2c)x + cx^2.
\end{aligned}
$$
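A quick coordinate check of the computation above (pure Python, no library assumed): the two matrices multiply to the identity, and $[T^{-1}]_{\beta}$ reproduces the coefficient formula just derived.

```python
def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

T_beta    = [[1, 1, 0], [0, 1, 2], [0, 0, 1]]    # [T]_beta
Tinv_beta = [[1, -1, 2], [0, 1, -2], [0, 0, 1]]  # [T^{-1}]_beta
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

print(matmul(T_beta, Tinv_beta) == I3)  # True
print(matmul(Tinv_beta, T_beta) == I3)  # True

# Apply [T^{-1}]_beta to sample coordinates (a, b, c) of a + bx + cx^2.
a, b, c = 5, 3, 2  # arbitrary sample coefficients
coords = [a, b, c]
image = [sum(Tinv_beta[i][j] * coords[j] for j in range(3)) for i in range(3)]
print(image == [a - b + 2*c, b - 2*c, c])  # True
```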
We can also determine the adjoint transformation of $T$. We have
$$
[T^*]_{\beta} = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 0 & 2 & 1
\end{bmatrix}.
$$
Hence,
$$
\begin{aligned}
T^*(a + bx + cx^2) &= T^*\left(\begin{bmatrix} 1 & x & x^2
\end{bmatrix}\begin{bmatrix} a \\ b \\ c
\end{bmatrix}\right)\\
&= \begin{bmatrix} 1 & x & x^2
\end{bmatrix}\begin{bmatrix} a \\ a+b \\ 2b+c
\end{bmatrix} \\
&= a + (a+b)x + (2b+c)x^2.
\end{aligned}
$$
We can also use a different basis for $\mathbb{P}_2$. For example, choose $\gamma = \{1+x+x^2, x+x^2, x^2\}$. The matrix representation of $T$ is then
$$
[T]^{\gamma}_{\beta} = \begin{bmatrix} 1 & 1 & 0 \\ -1 & 0 & 2 \\ 0 & -1 & -1
\end{bmatrix},
$$
and
$$
[T^{-1}]_{\gamma}^{\beta} = \begin{bmatrix} 2 & 1 & 2 \\ -1 & -1 & -2 \\ 1 & 1 & 1
\end{bmatrix}.
$$
We again have
$$
\begin{aligned}
T^{-1}(a + bx + cx^2) &= T^{-1}\left(\begin{bmatrix} 1+x+x^2 & x+x^2 & x^2
\end{bmatrix}\begin{bmatrix} a \\ b-a \\ c-b
\end{bmatrix}\right)\\
&= \begin{bmatrix} 1 & x & x^2
\end{bmatrix}\begin{bmatrix} a-b+2c \\ b-2c \\ c
\end{bmatrix} \\
&= (a-b+2c) + (b-2c)x + cx^2.
\end{aligned}
$$
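The mixed-basis computation can be checked the same way: $[T]^{\gamma}_{\beta}\,[T^{-1}]_{\gamma}^{\beta}$ should be the identity in $\gamma$-coordinates, and applying $[T^{-1}]_{\gamma}^{\beta}$ to the $\gamma$-coordinates $(a,\, b-a,\, c-b)$ should return the $\beta$-coordinates of $T^{-1}(a+bx+cx^2)$. A pure-Python check:

```python
def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

T_gb    = [[1, 1, 0], [-1, 0, 2], [0, -1, -1]]  # [T]^gamma_beta
Tinv_bg = [[2, 1, 2], [-1, -1, -2], [1, 1, 1]]  # [T^{-1}]^beta_gamma
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

print(matmul(T_gb, Tinv_bg) == I3)  # True (identity in gamma-coordinates)

a, b, c = 4, 1, 3                 # arbitrary sample coefficients
gamma_coords = [a, b - a, c - b]  # coordinates of a + bx + cx^2 w.r.t. gamma
beta_coords = [sum(Tinv_bg[i][j] * gamma_coords[j] for j in range(3))
               for i in range(3)]
print(beta_coords == [a - b + 2*c, b - 2*c, c])  # True
```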
#### Example 2
Let $T\in\mathcal{L}(\mathbb{P}_2, \mathbb{P}_2)$ with $T(p) = p'$.
It is easy to check that $1\in\text{Ker}(T)$, so $T$ is not one-to-one; thus it is not left invertible, and hence not invertible.
Also, $x^2\notin\text{Ran}(T)$, so $T$ is not onto and is thus not right invertible.
#### Example 3
Let $T\in\mathcal{L}(\mathbb{P}_2, \mathbb{P}_1)$ with $T(p) = p'$.
Again, $1\in\text{Ker}(T)$, so $T$ is not one-to-one; thus it is not left invertible, and hence not invertible.
But $T$ is onto, and we can define $C\in\mathcal{L}(\mathbb{P}_1, \mathbb{P}_2)$ with $C(p)=\int^x_0p(s)\,ds$.
One can easily check that $(T\circ C)(p) = p$, $\forall p\in\mathbb{P}_1$.
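In coordinates (an illustration, assuming the bases $\{1, x, x^2\}$ for $\mathbb{P}_2$ and $\{1, x\}$ for $\mathbb{P}_1$), $T$ and $C$ are represented by a $2\times 3$ and a $3\times 2$ matrix; their product is the identity on $\mathbb{P}_1$ in one order but not in the other:

```python
from fractions import Fraction

def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# T(1)=0, T(x)=1, T(x^2)=2x  w.r.t. bases {1, x, x^2} and {1, x}
T = [[0, 1, 0], [0, 0, 2]]
# C(1)=x, C(x)=x^2/2
C = [[0, 0], [Fraction(1), 0], [0, Fraction(1, 2)]]

I2 = [[1, 0], [0, 1]]
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

print(matmul(T, C) == I2)  # True:  T∘C = identity on P1, so C is a right inverse
print(matmul(C, T) == I3)  # False: C∘T loses the constant term, so C is no left inverse
```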
#### Example 4
Let $T\in\mathcal{L}(V, V)$ with $T(p) = p'$, where $\beta = \{\sin(x), \cos(x)\}$ is a basis of $V$.
Then $T$ is one-to-one and onto, hence invertible.
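Concretely, $T(\sin) = \cos$ and $T(\cos) = -\sin$, so $[T]_{\beta} = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$, which is invertible with inverse $\begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}$, i.e. $T^{-1}(\sin) = -\cos$ and $T^{-1}(\cos) = \sin$. A quick check:

```python
def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

T_beta    = [[0, -1], [1, 0]]  # columns: coordinates of T(sin)=cos, T(cos)=-sin
Tinv_beta = [[0, 1], [-1, 0]]  # columns: coordinates of -cos and sin
I2 = [[1, 0], [0, 1]]

print(matmul(T_beta, Tinv_beta) == I2)  # True
print(matmul(Tinv_beta, T_beta) == I2)  # True
```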