---
title: Ch1-2
tags: Linear algebra
GA: G-77TT93X4N1
---
# Chapter 1 extra note 2
> Linear combination
> bases
## Selected lecture notes:
:::info
**Definition:**
$\{{\bf v}_1, \cdots, {\bf v}_n\}\subset V$ is called a ***basis*** (of the vector space $V$) if any ${\bf v}\in V$ admits a unique representation as a linear combination
$$
{\bf v} = \sum^n_{k=1} \alpha_k {\bf v}_k.
$$
The coefficients $\alpha_1, \cdots, \alpha_n$ are called ***coordinates*** of the vector ${\bf v}$.
:::
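The coordinates in a basis can be computed concretely when $V = F^n$: stack the basis vectors as columns and solve the resulting linear system. Below is a minimal Python sketch using exact `Fraction` arithmetic; the function name `coordinates` and the setup are illustrative choices, not from the text, and the input is assumed to really be a basis.

```python
from fractions import Fraction

def coordinates(basis, v):
    """Solve sum_k alpha_k * basis[k] = v by Gauss-Jordan elimination.

    `basis` is assumed to be a genuine basis of F^n (n vectors of
    length n), so the solution exists and is unique.
    """
    n = len(basis)
    # Augmented matrix [B | v], where column k of B is basis[k].
    A = [[Fraction(basis[k][i]) for k in range(n)] + [Fraction(v[i])]
         for i in range(n)]
    for col in range(n):
        # A usable pivot row always exists when `basis` is a basis.
        piv = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        A[col] = [x / A[col][col] for x in A[col]]   # scale pivot to 1
        for r in range(n):
            if r != col and A[r][col] != 0:          # clear the column
                A[r] = [x - A[r][col] * y for x, y in zip(A[r], A[col])]
    return [A[i][n] for i in range(n)]               # the coordinates

# For the basis {(1,0), (1,1)} of R^2, the vector (3,5) has
# coordinates (-2, 5), since -2*(1,0) + 5*(1,1) = (3,5).
print(coordinates([[1, 0], [1, 1]], [3, 5]))
```

Uniqueness of the coordinates corresponds to the coefficient matrix being invertible, which is exactly what makes the elimination above succeed.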
:::info
**Definition:**
$\{{\bf v}_1, \cdots, {\bf v}_p\}\subset V$ is called a ***generating system*** (or ***spanning system***) in $V$ if any ${\bf v}\in V$ admits representation as a linear combination
$$
{\bf v} = \sum^p_{k=1} \alpha_k {\bf v}_k.
$$
:::
**Remark:**
The representation in a generating system may not be unique.
:::info
**Definition:**
A linear combination $\alpha_1{\bf v}_1+\cdots+\alpha_p{\bf v}_p$ is called a ***trivial combination*** if $\alpha_k=0$ for all $k$.
:::
:::info
**Definition:**
$\{{\bf v}_1, \cdots, {\bf v}_p\}\subset V$ is called ***linearly independent*** if only the trivial combination of ${\bf v}_1, \cdots, {\bf v}_p$ gives ${\bf 0}$. In other words, $\{{\bf v}_1, \cdots, {\bf v}_p\}\subset V$ is called ***linearly independent*** if the equation
$$
\alpha_1{\bf v}_1+\cdots+\alpha_p{\bf v}_p = {\bf 0}
$$
has only the zero solution $\alpha_1=\cdots=\alpha_p=0$.
:::
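For vectors in $F^n$, this definition can be tested mechanically: row-reduce the matrix whose rows are the given vectors and check whether its rank equals the number of vectors. A minimal Python sketch (the function name is my own choice), again with exact `Fraction` arithmetic:

```python
from fractions import Fraction

def is_independent(vectors):
    """True iff the given vectors in F^n are linearly independent,
    i.e. the matrix with these rows has full row rank."""
    M = [[Fraction(x) for x in v] for v in vectors]
    rank = 0
    for col in range(len(M[0])):
        piv = next((r for r in range(rank, len(M)) if M[r][col] != 0), None)
        if piv is None:
            continue                                  # no pivot in this column
        M[rank], M[piv] = M[piv], M[rank]
        M[rank] = [x / M[rank][col] for x in M[rank]]
        for r in range(len(M)):
            if r != rank and M[r][col] != 0:
                M[r] = [x - M[r][col] * y for x, y in zip(M[r], M[rank])]
        rank += 1
    return rank == len(M)

print(is_independent([[1, 0], [1, 1]]))   # independent pair: True
print(is_independent([[1, 2], [2, 4]]))   # second is twice the first: False
```

Note that `is_independent([[0, 0]])` returns `False`, consistent with Exercise 2.2 (a) below: any system containing the zero vector is linearly dependent.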
:::info
**Definition:**
$\{{\bf v}_1, \cdots, {\bf v}_p\}\subset V$ is called ***linearly dependent*** if it is not linearly independent. In other words, $\{{\bf v}_1, \cdots, {\bf v}_p\}\subset V$ is called ***linearly dependent*** if there exist $\alpha_1,\cdots,\alpha_p$, not all zero, such that
$$
\alpha_1{\bf v}_1+\cdots+\alpha_p{\bf v}_p = {\bf 0}.
$$
:::
**Proposition 2.6:**
$\{{\bf v}_1, \cdots, {\bf v}_p\}\subset V$ is ***linearly dependent*** if and only if one of the vectors ${\bf v}_k$ can be written as a linear combination of the other vectors.
* Proof:
> ($\Rightarrow$)
> Suppose $\{{\bf v}_1, \cdots, {\bf v}_p\}\subset V$ is linearly dependent. Then there exist $\alpha_1,\cdots,\alpha_p$, not all zero, such that
$$
\alpha_1{\bf v}_1+\cdots+\alpha_p{\bf v}_p = {\bf 0}.
$$
Let $k$ be an index such that $\alpha_k\ne 0$; then
$$
\alpha_k{\bf v}_k = \sum^p_{i=1, i\ne k} -\alpha_i {\bf v}_i.
$$
Since $\alpha_k\ne 0$, we can divide both sides by $\alpha_k$ to obtain
$$
{\bf v}_k = \sum^p_{i=1, i\ne k} -\frac{\alpha_i}{\alpha_k} {\bf v}_i.
$$
Therefore, ${\bf v}_k$ is a linear combination of the other vectors.
>
> ($\Leftarrow$) See the proof in textbook.
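The ($\Rightarrow$) construction above can be carried out numerically for vectors in $F^n$: find a non-trivial relation $\sum_i\alpha_i{\bf v}_i={\bf 0}$, then read off ${\bf v}_k=\sum_{i\ne k}(-\alpha_i/\alpha_k){\bf v}_i$ for any $k$ with $\alpha_k\ne 0$. A hedged Python sketch (the function name is my own):

```python
from fractions import Fraction

def dependency(vectors):
    """Return alpha_1..alpha_p, not all zero, with sum alpha_i*v_i = 0,
    or None if the vectors are linearly independent.

    For any k with alpha_k != 0 one then has, as in the proof above,
    v_k = sum over i != k of (-alpha_i / alpha_k) * v_i.
    """
    p, n = len(vectors), len(vectors[0])
    # Row-reduce the matrix whose COLUMNS are the given vectors.
    M = [[Fraction(vectors[j][i]) for j in range(p)] for i in range(n)]
    pivots = {}                       # pivot column -> its pivot row
    r = 0
    for col in range(p):
        piv = next((i for i in range(r, n) if M[i][col] != 0), None)
        if piv is None:
            # Free column: set alpha_col = 1, solve for the pivot columns.
            alpha = [Fraction(0)] * p
            alpha[col] = Fraction(1)
            for c, row in pivots.items():
                alpha[c] = -M[row][col]
            return alpha
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][col] for x in M[r]]
        for i in range(n):
            if i != r and M[i][col] != 0:
                M[i] = [x - M[i][col] * y for x, y in zip(M[i], M[r])]
        pivots[col] = r
        r += 1
    return None                       # full column rank: independent

# (1,1) = 1*(1,0) + 1*(0,1), i.e. the relation -v1 - v2 + v3 = 0.
print(dependency([[1, 0], [0, 1], [1, 1]]))
```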
**Proposition 2.7:**
$\{{\bf v}_1, \cdots, {\bf v}_p\}\subset V$ is a basis if and only if it is linearly independent and generating.
* Proof:
> ($\Rightarrow$) True by our definition of a basis.
>
> ($\Leftarrow$) See the proof in textbook.
**Proposition 2.8:**
Any (finite) generating system contains a basis.
* Proof:
> Suppose $S = \{{\bf v}_1, \cdots, {\bf v}_n\}\subset V$ is a generating set. If it is linearly independent, then by **Proposition 2.7** it is a basis. Done.
>
> Assume $S$ is not linearly independent; then it is linearly dependent, and by **Proposition 2.6** there are an index $k$ and scalars $\{\alpha_i\}$ such that
> $$
> {\bf v}_k = \sum_{i\ne k} \alpha_i {\bf v}_i.
> $$
>
> *Claim:* $S_1 = S\setminus \{{\bf v}_k\}$ is also generating.
> > pf:
> > Given ${\bf v}\in V$, since $S$ is generating, we have
> > $$
> > {\bf v} = \sum_i c_i{\bf v}_i.
> > $$
> > We then have
> > $$
> > {\bf v} = \sum_{i\ne k} c_i{\bf v}_i + c_k{\bf v}_k = \sum_{i\ne k} c_i{\bf v}_i + c_k\sum_{i\ne k} \alpha_i {\bf v}_i = \sum_{i\ne k} (c_i + c_k\alpha_i) {\bf v}_i.
> > $$
> > So ${\bf v}$ can also be presented as a linear combination of elements in $S_1$. Therefore $S_1$ is generating.
>
> We can repeat the process. If $S_1$ is linearly independent, we are done. If not, there is one vector we can remove to obtain again a generating system.
>
> Finally, since a set consisting of a single non-zero vector is linearly independent, the above procedure will not end with an empty set.
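The removal procedure of Proposition 2.8 can be mirrored in code for vectors in $F^n$: scanning the generating system and keeping exactly those vectors that enlarge the span of the vectors kept so far is equivalent to repeatedly discarding span-redundant vectors. A hedged Python sketch (helper and function names are mine):

```python
from fractions import Fraction

def rank(vectors):
    """Rank of the matrix whose rows are the given vectors."""
    M = [[Fraction(x) for x in v] for v in vectors]
    if not M:
        return 0
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][col] for x in M[r]]
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                M[i] = [x - M[i][col] * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

def basis_from_generating(vectors):
    """Keep a vector iff it is NOT in the span of those kept so far.

    Dropping the span-redundant vectors, as in Proposition 2.8, leaves
    a linearly independent generating subset, i.e. a basis of the span."""
    kept = []
    for v in vectors:
        if rank(kept + [v]) > rank(kept):
            kept.append(v)
    return kept

# (2,0,0) and (1,1,0) are redundant; the survivors are a basis of the span.
print(basis_from_generating([[1, 0, 0], [2, 0, 0], [0, 1, 0], [1, 1, 0]]))
```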
## Exercises
##### 2.2 (a):
Any set containing a zero vector is linearly dependent.
* Proof:
> Let $S = \{{\bf v}_1, \cdots, {\bf v}_n\}\subset V$ and $\exists i\in\{1, \cdots, n\}$ such that ${\bf v}_i = {\bf 0}$.
> Choose
> $$
> \alpha_1 = \alpha_2 = \cdots = \alpha_{i-1} = \alpha_{i+1}=\cdots = \alpha_n = 0,
> $$
> and $\alpha_i=1$. Since $1\cdot {\bf 0}={\bf 0}$ and $c\cdot {\bf 0} = {\bf 0}$ for any $c\in F$, we have
> $$
> \alpha_1 {\bf v}_1 + \cdots +\alpha_n {\bf v}_n = {\bf 0}.
> $$
> That is, there exists a non-trivial combination that equals the zero vector, so $S$ is linearly dependent.
* Proof: (version 2)
> WLOG, let $S = \{{\bf 0}, {\bf v}_1, \cdots, {\bf v}_n\}\subset V$.
> Choose $\alpha_1 = \alpha_2 = \cdots = \alpha_n=0$, and since $1\cdot {\bf 0}={\bf 0}$ and $c\cdot {\bf 0} = {\bf 0}$ for any $c\in F$, we have
> $$
> 1\cdot{\bf 0} + \alpha_1 {\bf v}_1 + \cdots +\alpha_n {\bf v}_n = {\bf 0}.
> $$
> That is, there exists a non-trivial combination that equals the zero vector, so $S$ is linearly dependent.
##### 2.2 (c):
Subsets of linearly dependent sets are linearly dependent.
* The statement is false. A simple counterexample: let $S = \{{\bf v}, 2{\bf v}\}$ with ${\bf v}\ne {\bf 0}$, and let $S_1 = \{{\bf v}\}\subset S$. Then $S$ is linearly dependent (since $2\cdot{\bf v} + (-1)\cdot 2{\bf v} = {\bf 0}$), while its subset $S_1$ is linearly independent.
##### 2.2 (d):
Subsets of linearly independent sets are linearly independent.
* Proof:
> Let $S = \{{\bf v}_1, \cdots, {\bf v}_n\}\subset V$ be linearly independent.
> Let $s = \{{\bf v}_{i_1}, \cdots, {\bf v}_{i_m}\}\subset S$, where $1\le i_1 < \cdots < i_m\le n$.
> Suppose, for contradiction, that there exist $\alpha_1, \cdots, \alpha_m$, not all zero, such that
> $$
> \alpha_1{\bf v}_{i_1} + \cdots + \alpha_m{\bf v}_{i_m} = {\bf 0}.
> $$
> Then
> $$
> \alpha_1{\bf v}_{i_1} + \cdots + \alpha_m{\bf v}_{i_m} + \sum_{k\ne i_j, \forall j} 0\cdot {\bf v}_k = {\bf 0}.
> $$
> So there is a not-all-zero combination of elements of $S$ equal to the zero vector. But this is impossible, since $S$ is linearly independent.
> Therefore $s$ cannot be linearly dependent; it must be linearly independent.
* Proof: (version 2)
> Let $S = \{{\bf v}_1, \cdots, {\bf v}_n\}\subset V$ be linearly independent.
> WLOG, assume $s = \{{\bf v}_1, \cdots, {\bf v}_m\}$ is a subset of $S$, with $m\le n$.
> Suppose $s$ is linearly dependent; then there exist $\alpha_1, \cdots, \alpha_m$, not all zero, such that
> $$
> \alpha_1{\bf v}_{1} + \cdots + \alpha_m{\bf v}_{m} = {\bf 0}.
> $$
> Then
> $$
> \alpha_1{\bf v}_{1} + \cdots + \alpha_m{\bf v}_{m} + \sum^n_{k=m+1} 0\cdot {\bf v}_k = {\bf 0}.
> $$
> So there is a not-all-zero combination of elements of $S$ equal to the zero vector. But this is impossible, since $S$ is linearly independent.
> Therefore $s$ cannot be linearly dependent; it must be linearly independent.
##### 2.5:
Let ${\bf v_1, v_2, \cdots, v_r}\in V$ be linearly independent but not generating. There exists ${\bf v_{r+1}}\in V$ such that the system $\{{\bf v_1, v_2, \cdots, v_r, v_{r+1}}\}$ is linearly independent.
* Proof:
> Since $\{{\bf v_1, v_2, \cdots, v_r}\}$ is not generating, there exists ${\bf v_{r+1}}\in V$ such that
> $$
> \tag{1}
> {\bf v_{r+1}}\notin\text{span}\{{\bf v_1, v_2, \cdots, v_r}\}.
> $$
> Suppose $\{{\bf v_1, v_2, \cdots, v_r, v_{r+1}}\}$ is linearly dependent; then there exist $\alpha_1, \cdots, \alpha_{r+1}$, not all zero, such that
> $$
> \alpha_1{\bf v_1} + \cdots + \alpha_{r+1}{\bf v_{r+1}} = {\bf 0}.
> $$
> Case 1: $\alpha_{r+1}=0$.
> Then $\alpha_1, \cdots, \alpha_{r}$ are not all zero and
> $$
> \alpha_1{\bf v_1} + \cdots + \alpha_{r}{\bf v_{r}} = {\bf 0},
> $$
> which is not possible since $\{{\bf v_1, v_2, \cdots, v_r}\}$ is linearly independent.
>
> Case 2: $\alpha_{r+1}\ne 0$.
> Then we have
> $$
> {\bf v_{r+1}} = \frac{-\alpha_1}{\alpha_{r+1}}{\bf v_1} + \cdots + \frac{-\alpha_r}{\alpha_{r+1}}{\bf v_{r}},
> $$
> that is, ${\bf v_{r+1}}\in\text{span}\{{\bf v_1, \cdots, v_r}\}$ which contradicts (1).
>
> So $\{{\bf v_1, v_2, \cdots, v_r, v_{r+1}}\}$ cannot be linearly dependent; it must be a linearly independent set.
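The existence argument in Exercise 2.5 is constructive when $V=F^n$: scan candidate vectors and pick the first one outside $\text{span}\{{\bf v_1}, \cdots, {\bf v_r}\}$, detected by a rank increase. A hedged Python sketch (helper and function names are mine):

```python
from fractions import Fraction

def rank(vectors):
    """Rank of the matrix whose rows are the given vectors."""
    M = [[Fraction(x) for x in v] for v in vectors]
    if not M:
        return 0
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][col] for x in M[r]]
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                M[i] = [x - M[i][col] * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

def extend(independent, candidates):
    """Return the first candidate outside span(independent) -- adding it
    keeps the system linearly independent, as in Exercise 2.5 -- or
    None if every candidate already lies in the span."""
    for v in candidates:
        if rank(independent + [v]) > rank(independent):
            return v
    return None

# (2,2,0) is in span{(1,1,0)}, but (0,0,1) is not, so it is chosen.
print(extend([[1, 1, 0]], [[2, 2, 0], [0, 0, 1]]))
```

Scanning the standard basis vectors of $F^n$ as candidates always succeeds here, since a proper subspace cannot contain all of them.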