---
title: Ch1-7
tags: Linear algebra
GA: G-77TT93X4N1
---
# Chapter 1 extra note 7
> isomorphism; isomorphic
> subspace
> dimension of a vector space
> replacement theorem
## Selected lecture notes:
### Isomorphic
:::info
**Definition:**
An invertible linear transformation is called an ***isomorphism***.
:::
:::info
**Definition:**
Two vector spaces $V$ and $W$ are called ***isomorphic*** ($V\cong W$) if $\exists$ an isomorphism $T:V\to W$.
:::
**Theorem:**
Let $T:V\to W$ be an isomorphism and let $\{v_1, \cdots, v_n\}$ be a basis of $V$; then $\{T(v_1), \cdots, T(v_n)\}$ is a basis of $W$.
* Proof:
> claim 1: $\{T(v_1), \cdots, T(v_n)\}$ is linearly independent
> > pf:
> > $\because$ $T$ is an isomorphism, $\exists T^{-1}:W\to V$.
> > Let us assume
> > $$
> > c_1 T(v_1) + \cdots + c_nT(v_n)=0.
> > $$
> > By applying $T^{-1}$ to both sides of the equation, the right-hand side gives $T^{-1}(0) = 0$, while the left-hand side gives
> > $$
> > T^{-1}\left(c_1 T(v_1) + \cdots + c_nT(v_n)\right)=c_1v_1 + \cdots + c_nv_n,
> > $$
> > where we have used the fact that $T^{-1}$ is linear.
> >
> > Since $\{v_1, \cdots, v_n\}$ is a basis, it is linearly independent, so $c_1 = \cdots = c_n=0$.
> > Therefore, $\{T(v_1), \cdots, T(v_n)\}$ is linearly independent.
>
> claim 2: $\{T(v_1), \cdots, T(v_n)\}$ is generating
> > pf:
> > Given $w\in W$, $T^{-1}(w)\in V$.
> > Since $\{v_1, \cdots, v_n\}$ is a basis, there exist scalars $c_1, \cdots, c_n$ such that
> > $$
> > T^{-1}(w) = c_1 v_1 + \cdots + c_n v_n.
> > $$
> > By applying $T$ to both sides we have
> > $$
> > w = c_1 T(v_1) + \cdots + c_nT(v_n).
> > $$
> > Thus $w$ can be written as a linear combination of $\{T(v_1), \cdots, T(v_n)\}$, so this set is generating.
> >
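For example, $T:\mathbb{R}^2\to P_1(\mathbb{R})$ given by $T(a, b) = a + bx$ is an isomorphism, and it maps the standard basis $\{(1,0), (0,1)\}$ of $\mathbb{R}^2$ to $\{1, x\}$, which is indeed a basis of $P_1(\mathbb{R})$.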
**Corollary:**
Let $T:V\to W$ be linear and $\{v_1, \cdots, v_n\}$ be a basis of $V$. If $\{T(v_1),\cdots, T(v_n)\}$ is not a basis of $W$, then $T$ is not invertible.
**Theorem:**
Let $V, W$ be vector spaces with bases $\beta=\{v_1, \cdots, v_n\}$ and $\gamma=\{w_1, \cdots, w_n\}$, respectively. We define $T:V\to W$ by
$$
T(c_1v_1 + \cdots + c_nv_n) = c_1 w_1 + \cdots + c_n w_n.
$$
Then $T$ is an isomorphism.
* Proof:
> Since $\beta$ is a basis, every $v\in V$ has a unique expression of this form, so $T$ is well-defined, and $T$ is clearly linear.
>
> Define $S:W\to V$ by
> $$
> S(c_1w_1 + \cdots + c_nw_n) = c_1 v_1 + \cdots + c_n v_n.
> $$
> Since $\gamma$ is a basis, $S$ is well-defined.
> Also we have $T\circ S=I_W$ and $S\circ T=I_V$.
> Therefore, $T$ is both right and left invertible, and is thus an isomorphism.
**Remark:**
$$
[T]^{\gamma}_{\beta} = I_n:\text{the $n\times n$ identity matrix}.
$$
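For example, take $V=P_2(\mathbb{R})$ with $\beta=\{1, x, x^2\}$ and $W=\mathbb{R}^3$ with the standard basis $\gamma$. Then $T(a + bx + cx^2) = (a, b, c)$ is an isomorphism and $[T]^{\gamma}_{\beta}=I_3$. In particular, this theorem shows that any two vector spaces over the same field with $n$-element bases are isomorphic.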
---
### Vector subspace
:::info
**Definition:**
Let $V_0\subseteq V$ where $V$ is a vector space. $V_0$ is a ***vector subspace*** if it is itself a vector space under the same field, vector addition, and scalar multiplication.
:::
**Theorem:**
Let $V$ be a vector space and $V_0\subseteq V$ be nonempty; then $V_0$ is a vector subspace if and only if $\alpha {\bf u} + \beta {\bf v}\in V_0$ for all $\alpha, \beta \in F$ and ${\bf u, v}\in V_0$.
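For example, $V_0=\{(x, y, 0): x, y\in\mathbb{R}\}\subseteq\mathbb{R}^3$ is a subspace: it is nonempty, and $\alpha(x_1, y_1, 0) + \beta(x_2, y_2, 0) = (\alpha x_1 + \beta x_2, \alpha y_1 + \beta y_2, 0)\in V_0$ for all $\alpha, \beta\in\mathbb{R}$.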
#### Example 1
Let $T:V\to W$ be a linear transformation. Then $\text{Ker}(T)=\{{\bf v}\in V: T({\bf v})={\bf 0}\}\subseteq V$ is a subspace.
* Proof:
> Given ${\bf u, v}\in \text{Ker}(T)$, $\alpha, \beta \in F$, we have $T({\bf u}) = {\bf 0}$ and $T({\bf v}) = {\bf 0}$.
> Since $T$ is linear and $V$ is a vector space,
> $$
> T(\alpha {\bf u} + \beta {\bf v}) = \alpha T({\bf u}) + \beta T({\bf v}) = \alpha\cdot {\bf 0} + \beta\cdot {\bf 0} = {\bf 0}.
> $$
> So $\alpha {\bf u} + \beta {\bf v}\in\text{Ker}(T)$.
#### Example 2
Let $T:V\to W$ be a linear transformation. Then $\text{Ran}(T)=\{T({\bf v}): {\bf v}\in V\}\subseteq W$ is a subspace.
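A minimal numerical sketch of these two facts, assuming a concrete map $T({\bf x})=A{\bf x}$ on $\mathbb{R}^3$ (the matrix `A` below is an arbitrary illustrative choice): linear combinations of kernel vectors stay in the kernel, and linear combinations of range vectors stay in the range.
```python
import numpy as np

# Illustrative linear map T(x) = A x on R^3; A is an arbitrary rank-2 matrix.
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [0., 1., 1.]])

# --- Ker(T) is closed under linear combinations ---
u = np.array([-1., -1., 1.])   # A @ u = 0, so u is in Ker(T)
v = 2.0 * u                    # another kernel vector
alpha, beta = 3.0, -1.5
combo = alpha * u + beta * v
print(np.allclose(A @ u, 0), np.allclose(A @ combo, 0))  # True True

# --- Ran(T) is closed under linear combinations ---
w1 = A @ np.array([1., 0., 2.])    # a vector in Ran(T)
w2 = A @ np.array([0., 1., -1.])   # another vector in Ran(T)
target = alpha * w1 + beta * w2
# target lies in Ran(T) iff A x = target is solvable; check via least squares
x, *_ = np.linalg.lstsq(A, target, rcond=None)
print(np.allclose(A @ x, target))  # True: the combination stays in Ran(T)
```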
---
### Dimension of a vector space
**Theorem (replacement theorem):**
Let $V$ be a vector space and $G=\{v_1, \cdots, v_n\}$, $n\ge 1$, be a generating set of $V$. Let $L=\{w_1, \cdots, w_m\}\subset V$ be linearly independent. Then $m\le n$, and there exists $H\subseteq G$ with exactly $n-m$ vectors such that $H\cup L$ is a generating set of $V$.
* Proof:
> We prove by induction on $k$, the number of vectors in the linearly independent set $L$.
>
> * $k=1$:
>
> $L=\{w_1\}$ is linearly independent. Since $n\ge 1$, we have $k=1\le n$.
> Since $G$ is generating, we have,
> $$
> w_1 = c_1v_1 + \cdots + c_nv_n.
> $$
> claim: $c_1, \cdots, c_n$ not all zero
> > pf:
> > If $c_1=c_2=\cdots=c_n=0$, then $w_1=0$, contradicting the linear independence of $\{w_1\}$.
> > So $c_1, \cdots, c_n$ cannot all be zero.
>
> WLOG assume $c_1\ne 0$, then
> $$
> v_1 = \frac{1}{c_1}w_1 - \frac{c_2}{c_1}v_2 - \cdots - \frac{c_n}{c_1}v_n.
> $$
> That is, $v_1\in\text{span}\{w_1, v_2, \cdots, v_n\}$.
> It is then clear that $V=\text{span}\{v_1, \cdots, v_n\}\subseteq \text{span}\{w_1, v_2, \cdots, v_n\}$, so $\{w_1, v_2, \cdots, v_n\}$ is a generating set.
>
> * Assume the statement holds for $k=m$. Let us check $k=m+1$:
>
> Let $L=\{w_1, \cdots, w_{m+1}\}\subset V$ be linearly independent, then $\tilde{L}=\{w_1, \cdots, w_{m}\}\subset V$ is also linearly independent.
> By the induction hypothesis, $m\le n$ and there exists $\tilde{H}\subseteq G$ with exactly $n-m$ vectors such that $\tilde{H}\cup \tilde{L}$ is a generating set. WLOG, we may assume $\tilde{H}=\{v_{m+1}, \cdots, v_n\}$.
> Since $\tilde{H}\cup \tilde{L}$ is a generating set, we have
> $$
> w_{m+1} = c_1w_1 + \cdots + c_mw_m + c_{m+1}v_{m+1} + \cdots + c_nv_n.
> $$
> claim: $c_{m+1}, \cdots, c_n$ not all zero
> > pf:
> > If $c_{m+1}=c_{m+2}=\cdots=c_n=0$, then
> > $$
> > c_1w_1 + \cdots + c_mw_m - w_{m+1} = 0.
> > $$
> > This is a non-trivial linear combination equal to $0$, so $\{w_1, \cdots, w_{m+1}\}$ would not be linearly independent, a contradiction.
> > So $c_{m+1}, \cdots, c_n$ cannot all be zero.
>
> A direct consequence of this claim is that $\tilde{H}$ cannot be the empty set, that is, $n-m>0$, so $m+1\le n$.
>
> Since $c_{m+1}, \cdots, c_n$ cannot all be zero, WLOG assume $c_{m+1}\ne 0$; then
> $$
> v_{m+1} \in\text{span}\{w_1, \cdots, w_{m+1}, v_{m+2}, \cdots, v_n\}.
> $$
> Let us define $H=\{v_{m+2}, \cdots, v_n\}$, which has exactly $n-(m+1)$ vectors. It is then clear that $V=\text{span}(\tilde{H}\cup \tilde{L})\subseteq \text{span}(H\cup L)$, so $H\cup L$ is a generating set.
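As a small illustration in $\mathbb{R}^3$: take $G=\{e_1, e_2, e_3\}$ and $L=\{w_1\}$ with $w_1=(1,1,0)=1\cdot e_1 + 1\cdot e_2 + 0\cdot e_3$. The coefficient of $e_1$ is nonzero, so we may take $H=\{e_2, e_3\}$; indeed $\{w_1, e_2, e_3\}$ still generates $\mathbb{R}^3$, since $e_1 = w_1 - e_2$.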
**Corollary:**
If $\{v_1, \cdots, v_n\}$ and $\{w_1, \cdots, w_m\}$ are both bases of a vector space $V$, then $n=m$.
* Proof:
> $\{v_1, \cdots, v_n\}$ is a generating set and $\{w_1, \cdots, w_m\}$ is linearly independent, so $m\le n$.
> Similarly, $\{w_1, \cdots, w_m\}$ is a generating set and $\{v_1, \cdots, v_n\}$ is linearly independent, so $n\le m$.
> Therefore, $n=m$.
:::info
**Definition:**
Let $V$ be a vector space and $\{v_1, \cdots, v_n\}\subset V$ be a basis. We define $\text{dim}(V)=n$; by the corollary above, this number does not depend on the choice of basis.
:::
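For example, $\text{dim}(\mathbb{R}^n)=n$ (the standard basis has $n$ vectors), $\text{dim}(P_n(\mathbb{R}))=n+1$ (with basis $\{1, x, \cdots, x^n\}$), and $\text{dim}(M_{m\times n}(F))=mn$.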