---
tags: Giang's linear algebra notes
---
# Chapter 2: Bases
[toc]
## Bases
**A basis of a vector space $V$** is a set $S$ which spans $V$ while also being linearly independent. In other words, $S$ is a minimal set of vectors which spans $V$
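*Example*: $\{(1,0),(0,1)\}$ spans $\textbf{R}^{2}$ and is linearly independent, hence it is a basis of $\textbf{R}^{2}$. The set $\{(1,0),(0,1),(1,1)\}$ still spans $\textbf{R}^{2}$ but is linearly dependent since $(1,1)=(1,0)+(0,1)$, so it is not a basis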
**Vector representation via basis**: Let $\{v_{1},\dots,v_{n}\}$ be a basis of a vector space $V$. Then, every $v\in V$ can be written uniquely as
$$v=\sum_{i=1}^{n}a_{i}v_{i}$$
for some $\{a_{i}\}_{i=1}^{n}$ with $a_{i}\in\textbf{R}$
*Proof*
Since $\{v_{1},\dots,v_{n}\}$ spans $V$, every $v\in V$ can be written as $v=\sum_{i=1}^{n}a_{i}v_{i}$, which gives existence. For uniqueness, assume that $v=\sum_{i=1}^{n}b_{i}v_{i}$ also holds, where
$$\exists i\in\{1,\dots,n\},a_{i}\neq b_{i}$$
We have that
$$\begin{aligned} & \sum_{i=1}^{n}a_{i}v_{i}=\sum_{i=1}^{n}b_{i}v_{i}\\ \Leftrightarrow & \sum_{i=1}^{n}(a_{i}-b_{i})v_{i}=0\end{aligned}$$
Since $b_{i}\neq a_{i}$ for some $i\in\{1,\dots,n\}$, the coefficients $(a_{i}-b_{i})$ are not all zero. This implies that $\{v_{i}\}_{i=1}^{n}$ is not linearly independent, thus not a basis of $V$. This is a contradiction
- [ ]
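The coordinate representation above can also be computed numerically. Below is a minimal sketch, assuming vectors in $\textbf{R}^{3}$ and numpy; the basis vectors are a hypothetical choice, stored as the columns of `B`, so the coefficients are obtained by solving a linear system

```python
import numpy as np

# Hypothetical basis of R^3: basis vectors stored as the columns of B
B = np.column_stack([(1.0, 0.0, 0.0),
                     (1.0, 1.0, 0.0),
                     (1.0, 1.0, 1.0)])
v = np.array([2.0, 3.0, 4.0])

# Solving B a = v yields the unique coefficients a_i with v = sum_i a_i v_i
a = np.linalg.solve(B, v)
print(a)                        # [-1. -1.  4.]
assert np.allclose(B @ a, v)    # v is recovered from its coordinates
```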
## Construction of basis
**Bottom-up approach for finding a basis of a vector space $V$**: Start with a set containing a single vector, then gradually add linearly independent vectors to the set until that set spans $V$
**Top-down approach for finding a basis of a vector space $V$**: Start with a large vector space, then impose constraints that cut the space down until the target vector space is obtained
>**NOTE**: The bottom-up approach is preferred for building a basis of a vector space $V$ (a numerical sketch is given at the end of this section)
**Theoretical foundation for bottom-up approach**: Let $V$ be a vector space, let $S$ be a linearly independent subset of $V$, and let $v\in V$ be a vector with $v\notin S$. Then
1. If $v\in\text{span}(S)$, then $S\cup\{v\}$ is linearly dependent, and $\text{span}(S\cup\{v\})=\text{span}(S)$
2. If $v\notin\text{span}(S)$, then $S\cup\{v\}$ is linearly independent, and $\text{span}(S\cup\{v\})\supset\text{span}(S)$
*Proof*
Let $S=\{v_{1},\dots,v_{n}\}$. We have that
$$v\in\text{span}(S)\Leftrightarrow v=\sum_{i=1}^{n}a_{i}v_{i}$$
for some $\{a_{i}\}_{i=1}^{n}$. Thus $$\sum_{i=1}^{n}a_{i}v_{i}-v=0$$ is a linear combination with a nonzero coefficient (the coefficient of $v$ is $-1$), so $\{v,v_{1},\dots,v_{n}\}$ is linearly dependent. Moreover, any element of $\text{span}(S\cup\{v\})$ can be rewritten as a combination of $v_{1},\dots,v_{n}$ alone by substituting $v=\sum_{i=1}^{n}a_{i}v_{i}$, hence $\text{span}(S\cup\{v\})=\text{span}(S)$. In case $v\notin\text{span}(S)$, assume that
$$av+\sum_{i=1}^{n}a_{i}v_{i}=0$$
where $a,a_{1},\dots,a_{n}$ are not all zero. If $a\neq0$ then
$$v=-\sum_{i=1}^{n}\frac{a_{i}}{a}v_{i}\in\text{span}(S)$$
This leads to a contradiction. In case $a=0$, the coefficients $a_{1},\dots,a_{n}$ are not all zero, so $\{v_{1},\dots,v_{n}\}$ is not linearly independent, which is also a contradiction. Thus
$$av+\sum_{i=1}^{n}a_{i}v_{i}=0$$
if and only if $a=a_{1}=\dots=a_{n}=0$. This implies $\{v,v_{1},\dots,v_{n}\}$ is linearly independent. Finally, $\text{span}(S)\subseteq\text{span}(S\cup\{v\})$ always holds, and $v\in\text{span}(S\cup\{v\})$ while $v\notin\text{span}(S)$, so the inclusion is strict
- [ ]
**Corollary**: If a linearly independent set $S$ does not span $V$, then we can enlarge the span, while preserving linear independence, by adding any vector $v\in V$ with $v\notin\text{span}(S)$
>**NOTE**: The theorem above is the theoretical foundation for the correctness of the bottom-up approach for building a basis
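A minimal numerical sketch of the bottom-up approach, assuming vectors in $\textbf{R}^{n}$ represented as numpy arrays: a candidate vector is kept exactly when it enlarges the span, detected here by a rank increase (the function name, the tolerance `tol`, and the example vectors are illustrative choices, not part of the notes)

```python
import numpy as np

def bottom_up_basis(vectors, tol=1e-10):
    """Greedily keep the vectors that enlarge the span (bottom-up approach sketch).

    `vectors` is an iterable of 1-D numpy arrays of equal length.
    """
    basis = []
    for v in vectors:
        candidate = basis + [v]
        # v is kept only when it is NOT in span(basis), i.e. the rank grows
        if np.linalg.matrix_rank(np.column_stack(candidate), tol=tol) > len(basis):
            basis = candidate
    return basis

# Hypothetical spanning set of R^2 with redundancies
vecs = [np.array([1.0, 0.0]), np.array([2.0, 0.0]), np.array([1.0, 1.0])]
print(bottom_up_basis(vecs))   # keeps (1, 0) and (1, 1)
```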
## Dimension
A vector space may have several bases but all bases have the same number of vectors
**Replacement theorem**: Let $V$ be a vector space and let $S$ be a finite subset of $V$ with $\text{span}(S)=V$ and $|S|=n$. Let $L$ be a linearly independent subset of $V$ with $|L|=m$. Then
$$m\leq n$$
and
$$\exists S'\subseteq S,|S'|=n-m\land\text{span}(S'\cup L)=V$$
*Proof*
We prove the theorem by induction on $m$. The theorem holds for the base case $m=0$: trivially $0\leq n$, and $S'=S$ satisfies $\text{span}(S'\cup\emptyset)=V$. Assume that the theorem holds for $m-1$, i.e. $m-1\leq n$ and there exists $S'\subseteq S$ with $|S'|=n-m+1$ and $\text{span}(S'\cup L_{1:m-1})=V$, where $L_{1:m-1}=\{l_{1},\dots,l_{m-1}\}$. In case $m-1=n$, then
$$\text{span}(\emptyset\cup L_{1:m-1})=\text{span}(L_{1:m-1})=V$$
Now consider $L=L_{1:m-1}\cup\{l_{m}\}$. Since $l_{m}\in V=\text{span}(L_{1:m-1})$, the set $L$ would be linearly dependent (by the *theoretical foundation for bottom-up approach*), contradicting the linear independence of $L$. Thus $m-1<n$, i.e. $m\leq n$ must hold. For the replacement part, without loss of generality, we assume that the subset provided by the induction hypothesis is $S_{1:n-m+1}=\{s_{1},\dots,s_{n-m+1}\}$, i.e.
$$\text{span}(L_{1:m-1}\cup S_{1:n-m+1})=V$$
Since $l_{m}\in\text{span}(L_{1:m-1}\cup S_{1:n-m+1})=V$, we can write
$$l_{m}=\sum_{i=1}^{m-1}a_{i}l_{i}+\sum_{i=1}^{n-m+1}b_{i}s_{i}$$
for some coefficients $\{a_{i}\}_{i=1}^{m-1}\cup\{b_{i}\}_{i=1}^{n-m+1}$. The $b_{i}$ cannot all be zero, otherwise $l_{m}\in\text{span}(L_{1:m-1})$, which would make $L$ linearly dependent, a contradiction. Without loss of generality, we assume that $b_{1}\neq0$. We now have that
$$s_{1}=\frac{1}{b_{1}}\left(l_{m}-\sum_{i=1}^{m-1}a_{i}l_{i}-\sum_{i=2}^{n-m+1}b_{i}s_{i}\right)$$
We have that
$$\begin{aligned}\forall v\in V,v & =\sum_{i=1}^{m-1}\hat{a}_{i}l_{i}+\sum_{i=1}^{n-m+1}\hat{b}_{i}s_{i}\\ & =\sum_{i=1}^{m-1}\left(\hat{a}_{i}-\frac{\hat{b}_{1}}{b_{1}}a_{i}\right)l_{i}+\frac{\hat{b}_{1}}{b_{1}}l_{m}+\sum_{i=2}^{n-m+1}\left(\hat{b}_{i}-\frac{\hat{b}_{1}}{b_{1}}b_{i}\right)s_{i}\\ & \in\text{span}(L\cup S_{2:n-m+1})\end{aligned}$$
Thus we have that $V\subseteq\text{span}(L\cup S_{2:n-m+1})$, and we also have that $\text{span}(L\cup S_{2:n-m+1})\subseteq V$ since $L\subseteq V$ and $S\subseteq V$. Thus
$$\text{span}(L\cup S_{2:n-m+1})=V$$
must hold. Taking $S'=S_{2:n-m+1}$, which has $n-m$ elements, proves the second proposition.
- [ ]
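A small numerical illustration of the replacement theorem, under the assumption that vectors live in $\textbf{R}^{3}$ and spans are checked via matrix rank with numpy (the specific sets $S$, $L$, and $S'$ below are hypothetical choices)

```python
import numpy as np

# Hypothetical example in R^3: S spans R^3 (n = 4), L is linearly independent (m = 2)
S = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
     np.array([0.0, 0.0, 1.0]), np.array([1.0, 1.0, 1.0])]
L = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 1.0, 1.0])]

rank = lambda vectors: np.linalg.matrix_rank(np.column_stack(vectors))
assert rank(S) == 3 and rank(L) == len(L)   # S spans R^3, L is linearly independent

# The theorem promises some S' subset of S with |S'| = n - m = 2 such that
# span(L U S') = R^3; here S' = {(1,0,0), (0,0,1)} works
S_prime = [S[0], S[2]]
assert rank(L + S_prime) == 3
```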
**Basis size**: Let $V$ be a vector space and $B$ be a finite basis of
$V$ where $|B|=d$. Then
1. Any set $S\subseteq V$, where $|S|<d$, cannot span $V$, i.e. every spanning set of $V$ must contain at least $d$ elements
2. Any set $S\subset V$, where $|S|>d$, must be linearly dependent, i.e. every linearly independent set in $V$ can contain at most $d$ elements
**Direct consequences of *Basis size*** are
1. Any basis of $V$ must consist of exactly $d$ elements
2. Any spanning set of $V$, with exactly $d$ elements, forms a basis
3. Any set of $d$ linearly independent vectors of $V$ forms a basis
**Indirect consequences of *Basis size*** are
1. Any set of linearly independent elements of $V$ is contained in a basis
2. Any spanning set of $V$ contains a basis
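*Example*: In $\textbf{R}^{3}$, where the standard basis gives $d=3$, no set of two vectors can span the space, any set of four vectors is linearly dependent, and any three linearly independent vectors form a basis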
**Dimension**: A vector space $V$ has dimension $d$ if it contains a basis of $d$ elements. We denote
$$\text{dim}(V)=d$$
>**NOTE**: $V$ cannot have multiple different dimensions simultaneously
>**NOTE**: $\text{dim}(V)$ can be understood as the number of degrees of freedom inherent in $V$, or the number of possible linearly independent vectors in $V$
**Finite and infinite dimensional vector spaces**: When $\text{dim}(V)$ is finite, we say $V$ is *a finite-dimensional vector space*, otherwise $V$ is *an infinite-dimensional vector space*
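*Example*: $\textbf{R}^{n}$ is finite-dimensional with $\text{dim}(\textbf{R}^{n})=n$, and $P_{n-1}(\textbf{R})$, the space of polynomials of degree at most $n-1$, has dimension $n$ with basis $\{1,x,\dots,x^{n-1}\}$. The space of all polynomials is infinite-dimensional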
**Subspaces and dimension**: Let $V$ be a finite-dimensional vector space and $W$ be a subspace of $V$. Then
- $W$ is also finite-dimensional
- $\text{dim}(W)\leq\text{dim}(V)$
- $\text{dim}(W)=\text{dim}(V)$ if and only if $W=V$
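*Example*: The subspaces of $\textbf{R}^{3}$ are $\{\textbf{0}\}$ (dimension $0$), lines through the origin (dimension $1$), planes through the origin (dimension $2$), and $\textbf{R}^{3}$ itself (dimension $3$)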
## Lagrange interpolation
Lagrange interpolation is an application of the abstract theory to a basic problem: how to fit a polynomial to a specified number of points
**Lemma**: Let $S=\{(x_{i},y_{i})\}_{i=1}^{n}\subset\textbf{R}^{2}$ where the $x_{i}$ are distinct, and for each $j\in\{1,\dots,n\}$ let $f_{j}\in P_{n-1}(\textbf{R})$ be a polynomial of degree at most $n-1$ satisfying
$$f_{j}(x_{j})=1\land\forall k\neq j,f_{j}(x_{k})=0$$
Then the set $\{f_{1},\dots,f_{n}\}$ is a basis of $P_{n-1}(\textbf{R})$
*Proof*
We have that $$\text{dim}[P_{n-1}(\textbf{R})]=n$$ since $P_{n-1}(\textbf{R})$ has a basis $\{1,x,\dots,x^{n-1}\}$. We also have that $\{f_{1},\dots,f_{n}\}$ is a set of $n$ linearly independent vectors from $P_{n-1}(\textbf{R})$: if $\sum_{j=1}^{n}c_{j}f_{j}=0$, then evaluating at $x_{k}$ gives $c_{k}=0$ for every $k$. Thus $\{f_{1},\dots,f_{n}\}$ is a basis of $P_{n-1}(\textbf{R})$ (by the *direct consequences of Basis size*)
- [ ]
**Lagrange interpolation formula**: Let $S=\{(x_{i},y_{i})\}_{i=1}^{n}\subset\textbf{R}^{2}$ where $n\geq1$ and the $x_{i}$ are distinct. Then there exists a unique polynomial $f\in P_{n-1}(\textbf{R})$ of degree at most $n-1$ so that $$\forall(x,y)\in S,y=f(x)$$ and $f$ is given by
$$f(x)=\sum_{j=1}^{n}\frac{\prod_{1\leq k\leq n:k\neq j}(x-x_{k})}{\prod_{1\leq k\leq n:k\neq j}(x_{j}-x_{k})}y_{j}$$
$f$ is now called the interpolating polynomial for $\{(x_{i},y_{i})\}_{i=1}^{n}$
*Proof*
Define
$$f(x)=\sum_{i=1}^{n}y_{i}f_{i}(x)$$
Then $f\in P_{n-1}(\textbf{R})$ and $f(x_{j})=y_{j}$ for every $j$, which gives existence. For uniqueness, any $g\in P_{n-1}(\textbf{R})$ can be written as $g=\sum_{i=1}^{n}c_{i}f_{i}$ since $\{f_{1},\dots,f_{n}\}$ is a basis of $P_{n-1}(\textbf{R})$ (by the lemma above); if $g(x_{j})=y_{j}$ for every $j$, then evaluating at $x_{j}$ gives $c_{j}=y_{j}$, hence $g=f$. Now we have that
$$\forall j\neq i,f_{i}(x_{j})=0$$
Thus each $x_{j}$ with $j\neq i$ is a root of $f_{i}$, so
$$f_{i}(x)=Q(x)\prod_{1\leq j\leq n:j\neq i}(x-x_{j})$$
for some polynomial $Q$. We have that $f_{i}\in P_{n-1}(\textbf{R})$ and the product already has degree $n-1$, thus $Q(x)$ must be a constant, i.e. $f_{i}(x)=c_{i}\cdot\prod_{1\leq j\leq n:j\neq i}(x-x_{j})$. We have that $f_{i}(x_{i})=1$, thus
$$\begin{aligned} & f_{i}(x_{i})=c_{i}\cdot\prod_{1\leq j\leq n:j\neq i}(x_{i}-x_{j})=1\\ \Leftrightarrow & c_{i}=\frac{1}{\prod_{1\leq j\leq n:j\neq i}(x_{i}-x_{j})}\end{aligned}$$
Thus we conclude that
$$f_{i}(x)=\frac{\prod_{1\leq j\leq n:j\neq i}(x-x_{j})}{\prod_{1\leq j\leq n:j\neq i}(x_{i}-x_{j})}$$
Thus we have that
$$f(x)=\sum_{j=1}^{n}\frac{\prod_{1\leq k\leq n:k\neq j}(x-x_{k})}{\prod_{1\leq k\leq n:k\neq j}(x_{j}-x_{k})}y_{j}$$
- [ ]
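The formula above translates directly into code. Below is a minimal sketch in Python; the function name `lagrange_interpolate` and the sample points are hypothetical, and the $x_{i}$ are assumed distinct

```python
import numpy as np

def lagrange_interpolate(xs, ys):
    """Return the interpolating polynomial f built from the Lagrange formula."""
    def f(x):
        total = 0.0
        for j, (xj, yj) in enumerate(zip(xs, ys)):
            others = [xk for k, xk in enumerate(xs) if k != j]
            numerator = np.prod([x - xk for xk in others])     # prod_{k != j} (x - x_k)
            denominator = np.prod([xj - xk for xk in others])  # prod_{k != j} (x_j - x_k)
            total += yj * numerator / denominator
        return total
    return f

# Three points, so f has degree at most 2; these points lie on f(x) = x**2 + x + 1
xs, ys = [0.0, 1.0, 2.0], [1.0, 3.0, 7.0]
f = lagrange_interpolate(xs, ys)
assert all(abs(f(x) - y) < 1e-9 for x, y in zip(xs, ys))
print(f(3.0))   # 13.0
```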
**Uniqueness of Lagrange interpolation function**: There is exactly one polynomial of degree at most $n-1$ which passes through $n$ given points with distinct $x$-coordinates. Among polynomials of degree higher than $n-1$, there are infinitely many polynomials which satisfy
$$\forall(x,y)\in S,y=f(x)$$
**Interpolation**:
1. When the number of degrees of freedom, i.e. variables, exceeds the number of constraints, i.e. equations, one has many solutions to a problem
2. When the number of degrees of freedom, i.e. variables, is below the number of constraints, i.e. equations, one usually has no solutions to a problem
3. When the number of degrees of freedom, i.e. variables, equals the number of constraints, i.e. equations, one usually has exactly one solution to a problem
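*Example*: A polynomial of degree at most $2$ has three coefficients, i.e. three degrees of freedom. Two interpolation constraints leave infinitely many such polynomials, three constraints at distinct $x$-coordinates determine exactly one, and four constraints generally admit none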
## Expansions
>**NOTE**: $\emptyset$ is a basis of the zero vector space $\{\textbf{0}\}$
**Additive map**: A map $f$ which preserves vector addition, i.e. $f(u+v)=f(u)+f(v)$
**Homogeneous map**: A map $f$ which preserves scalar multiplication, i.e. $f(cv)=cf(v)$