---
tags: Giang's linear algebra notes
---
# Chapter 1: Vector space
[toc]
## Scalars
**Scalar**: A scalar is any quantity which can be described by a single number
**Other definitions of scalar**:
* François Viète’s “Analytic Art” definition of scalars:
>*Magnitudes which ascend or descend proportionally, in keeping with their nature from one kind to another*
* W. R. Hamilton’s 1846 definition of scalars:
>*The algebraically real part may receive, according to the question in which it occurs, all values contained on the one scale of progression of numbers, from negative to positive infinity*
**The field of scalars $\textbf{F}$** is the set of all scalars. This is usually denoted by $\textbf{R}$, the field of real numbers
**Basic operations with scalars** are *addition*, *subtraction*, and *multiplication*
## Vector space
**Vector**: A vector is any member of a vector space
>**NOTE**: A more popular definition of vector is any quantity with both direction and magnitude. However, it is misleading or false for certain kinds of vectors
**Vector space**: A vector space is any class of objects that can be added together and multiplied by scalars. Formally, a vector space is any collection $V$ of objects, called *vectors*, for which
1. Two operations can be performed, which are
- Vector addition, i.e. take $u,v\in V$ and return $v+u\in V$ (closed under addition)
    - Scalar multiplication, i.e. take a scalar $c\in\textbf{R}$ and a vector $v\in V$, and return $cv\in V$ (closed under scalar multiplication)
2. The following properties are satisfied
- Addition is commutative, i.e. $v+w=w+v$, $\forall v,w\in V$
- Addition is associative, i.e. $u+(v+w)=(u+v)+w$, $\forall u,v,w\in V$
    - Additive identity, i.e. $\exists\textbf{0}\in V\,\text{s.t.}\,\textbf{0}+v=v,\forall v\in V$
- Additive inverse, i.e. $\forall v\in V,\exists-v\in V\,\text{s.t.}\,-v+v=\textbf{0}$
- Multiplicative identity, i.e. the scalar $1$ has the property $1\cdot v=v,\forall v\in V$
    - Multiplication is associative, i.e. $a(bv)=(ab)v,\forall a,b\in\textbf{R},\forall v\in V$
    - Multiplication distributes over vector addition, i.e. $a(v+w)=av+aw,\forall a\in\textbf{R},\forall v,w\in V$
    - Multiplication distributes over scalar addition, i.e. $(a+b)v=av+bv,\forall a,b\in\textbf{R},v\in V$
Briefly, the conditions for a vector space amount to saying that the usual laws of algebra hold; a small numerical sanity check of these axioms is sketched after the notes below
>**NOTE**: In most cases, we do not need to verify all of the axioms
>**NOTE**: The zero vector is denoted $\textbf{0}$, but technically, it is not the same as the zero scalar $0$. However, in practice, there is no harm in confusing the two objects
>**NOTE**: There are other operations, such as dot products and lengths, which we can perform in $\textbf{R}^{n}$ but which are not common to all vector spaces
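As a quick illustration of the axioms above (a sanity check, not a proof), a few of them can be verified numerically for $\textbf{R}^{3}$ with numpy; the vectors and scalars here are arbitrary choices for demonstration:

```python
import numpy as np

# Sanity-check a few vector space axioms in R^3 with random vectors.
rng = np.random.default_rng(0)
u, v, w = rng.random(3), rng.random(3), rng.random(3)
a, b = 2.0, -1.5

assert np.allclose(u + v, v + u)                # addition is commutative
assert np.allclose(u + (v + w), (u + v) + w)    # addition is associative
assert np.allclose(a * (b * v), (a * b) * v)    # scalar multiplication is associative
assert np.allclose(a * (u + v), a * u + a * v)  # distributes over vector addition
assert np.allclose((a + b) * v, a * v + b * v)  # distributes over scalar addition
assert np.allclose(1.0 * v, v)                  # multiplicative identity
```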
**Examples of vector spaces**
- Scalars as vectors, i.e. $\textbf{R}$ can be thought of as a boring vector space
- Zero vector space, i.e. $\textbf{R}^{0}=\{\textbf{0}\}$
- Polynomials as vectors, i.e. $P_{n}(\textbf{R})=\{\sum_{i=0}^{n}a_{i}x^{i}:a_{i}\in\textbf{R}\}$ (see the coefficient-array sketch after this list)
- Functions as vectors, i.e. $C(\textbf{R})$ is the vector space of all continuous functions of $x$
- Functions as vectors, i.e. ${\cal F}(S,\textbf{R})=\{f\mid f:S\to\textbf{R}\}$
- Sequences as vectors, i.e. $\textbf{R}^{\infty}$ is the vector space of all infinite sequences, with
- Addition: $(a_{1},a_{2},\dots)+(b_{1},b_{2},\dots)=(a_{1}+b_{1},a_{2}+b_{2},\dots)$
- Scalar multiplication: $c\cdot(a_{1},a_{2},\dots)=(ca_{1},ca_{2},\dots)$
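To make the polynomial example concrete, a polynomial in $P_{n}(\textbf{R})$ can be identified with its coefficient array $(a_{0},\dots,a_{n})$, under which vector addition and scalar multiplication become the componentwise operations of $\textbf{R}^{n+1}$. A minimal sketch (the coefficient-array encoding is our own choice of representation):

```python
import numpy as np

# Represent p(x) = a_0 + a_1 x + ... + a_n x^n by the array [a_0, ..., a_n].
p = np.array([1.0, 0.0, 2.0])   # p(x) = 1 + 2x^2
q = np.array([0.0, 3.0, -1.0])  # q(x) = 3x - x^2

p_plus_q = p + q                # (p + q)(x) = 1 + 3x + x^2
two_p = 2.0 * p                 # (2p)(x)    = 2 + 4x^2

# Evaluating at a point confirms the identification is consistent
# (np.polyval expects coefficients from the highest degree down, hence [::-1]).
x = 1.7
assert np.isclose(np.polyval(p_plus_q[::-1], x),
                  np.polyval(p[::-1], x) + np.polyval(q[::-1], x))
```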
**Examples of non-vector spaces**
- Unit vectors, i.e. the set of unit vectors is closed neither under addition nor under scalar multiplication (see the sketch after this list)
- The positive real axis $\textbf{R}^{+}$, i.e. it is not closed under multiplication by negative scalars
- Latitude-longitude of a place, i.e. there is no reasonable way to define addition and scalar multiplication
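The failure of closure for unit vectors, for instance, can be checked directly: the sum of two unit vectors is generally not a unit vector. A tiny sketch (the vectors are arbitrary):

```python
import numpy as np

e1 = np.array([1.0, 0.0])        # a unit vector
e2 = np.array([0.0, 1.0])        # another unit vector
s = e1 + e2                      # s = (1, 1)
print(np.linalg.norm(s))         # sqrt(2) ≈ 1.414, so s is not a unit vector
```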
**Vector subspaces**: A vector space $W$ is a subspace of a vector space $V$ if
- $W\subseteq V$
- The operations of vector addition and scalar multiplication on $W$ are those inherited from $V$
**Subspace from space**: If $V$ is a vector space and $W\subseteq V$, then $W$ is a subspace of $V$ if and only if
- $W$ is nonempty (equivalently, $\textbf{0}\in W$)
- $W$ is closed under addition
- $W$ is closed under scalar multiplication
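As an illustration of this criterion, the plane $W=\{(x,y,0):x,y\in\textbf{R}\}\subseteq\textbf{R}^{3}$ passes both closure tests, while the shifted plane $\{(x,y,1)\}$ fails (it does not even contain $\textbf{0}$). A numerical sketch, with membership predicates of our own choosing:

```python
import numpy as np

def in_W(v):                     # W = {(x, y, 0)}: a subspace of R^3
    return np.isclose(v[2], 0.0)

def in_shifted(v):               # {(x, y, 1)}: not a subspace
    return np.isclose(v[2], 1.0)

u = np.array([1.0, 2.0, 0.0])
v = np.array([-3.0, 0.5, 0.0])
assert in_W(u + v) and in_W(3.0 * u)              # both closure tests pass

p = np.array([1.0, 0.0, 1.0])
q = np.array([0.0, 1.0, 1.0])
assert not in_shifted(p + q)                      # closure under addition fails
```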
**Improper subspaces of $V$**
1. Every vector space $V$ is considered a subspace of itself
2. $\{\textbf{0}\}$ is a subspace of every vector space
**Proper subspaces of $V$**: Any subspace $W$ of $V$, which is not $V$ or $\{0\}$, is a proper subspace of $V$
* Proper subspace and proper subset: The definition of a proper subspace parallels that of a proper subset
>**NOTE**: The intersection of two subspaces is a subspace, but the union of two subspaces is usually not a subspace
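The failure for unions is already visible in $\textbf{R}^{2}$: the two coordinate axes are subspaces, but $(1,0)+(0,1)=(1,1)$ lies on neither axis, so their union is not closed under addition. A one-line check of this standard example:

```python
import numpy as np

on_x_axis = lambda v: np.isclose(v[1], 0.0)   # the subspace {(x, 0)}
on_y_axis = lambda v: np.isclose(v[0], 0.0)   # the subspace {(0, y)}

s = np.array([1.0, 0.0]) + np.array([0.0, 1.0])   # s = (1, 1)
assert not (on_x_axis(s) or on_y_axis(s))         # the sum escapes the union
```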
## Linear combinations and span
**Linear combination of two vectors**: Let $v,w\in V$ be two vectors. Then, for any $a,b\in\textbf{R}$
$$av+bw$$
is called a *linear combination* of $v$ and $w$. The *span* of $v$ and $w$ is defined as
$$\text{span}(v,w)=\{av+bw:a,b\in\textbf{R}\}$$
**Linear combination**: Let $S=\{v_{1},\dots,v_{n}\}\subset V$ be a set of vectors from a vector space $V$. Then, for any $a_{i}\in\textbf{R}$
$$\sum_{i=1}^{n}a_{i}v_{i}$$
is called a *linear combination* of $S$. The *span* of $S$ is defined as
$$\text{span}(S)=\{\sum_{i=1}^{n}a_{i}v_{i}:a_{i}\in\textbf{R}\}$$
In case $S=\emptyset$, $\text{span}(S)=\{\textbf{0}\}$, i.e. the empty linear combination is defined to be $\textbf{0}$
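In $\textbf{R}^{n}$, deciding whether $u\in\text{span}(S)$ amounts to solving a linear system whose columns are the vectors of $S$. A sketch using numpy's least-squares solver (the helper `in_span` and the example vectors are our own):

```python
import numpy as np

def in_span(u, vectors, tol=1e-10):
    """Check whether u is a linear combination of the given vectors."""
    A = np.column_stack(vectors)                    # vectors as columns
    coeffs, *_ = np.linalg.lstsq(A, u, rcond=None)  # best-fit coefficients
    return np.allclose(A @ coeffs, u, atol=tol)     # exact combination found?

v = np.array([1.0, 0.0, 1.0])
w = np.array([0.0, 1.0, 1.0])
print(in_span(np.array([2.0, 3.0, 5.0]), [v, w]))   # True:  2v + 3w
print(in_span(np.array([0.0, 0.0, 1.0]), [v, w]))   # False: outside the plane
```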
**Properties of $\text{span}(S)$**
1. $\text{span}(S)$ is a subspace of $V$, which contains $S$ as a subset
2. Any subspace of $V$ which contains $S$ as a subset must be a superset of $\text{span}(S)$
*Proof*
Let $S=\{v_{1},\dots,v_{n}\}$. We have that $\text{span}(S)\subseteq V$ and that $\text{span}(S)$ is closed under addition and scalar multiplication (a sum of linear combinations of the $v_{i}$, or a scalar multiple of one, is again such a linear combination), thus it is a subspace of $V$ which contains $S$ as a subset. Now let $W$ be a subspace of $V$ which contains $S$ as a subset. By the closure properties of subspaces, we have that
$$\forall\{a_{i}\}_{i=1}^{n},a_{i}\in\textbf{R},\sum_{i=1}^{n}a_{i}v_{i}\in W$$
Thus $\text{span}(S)=\{\sum_{i=1}^{n}a_{i}v_{i}:a_{i}\in\textbf{R}\}\subseteq W$
$\blacksquare$
**Spanning set**: $S$ is said to span a vector space $V$ if $\text{span}(S)=V$. We call $S$ *a spanning set of* $V$ or *a generating set of* $V$, or say that $V$ *is generated by* $S$
>**NOTE**: Our interest now, however, is to find a minimal spanning set $S$ of $V$, i.e. one whose vectors are linearly independent
## Linear dependence and independence
**Linear dependence**: Let $S=\{v_{1},\dots,v_{n}\}\subset V$ be a set of
vectors where $V$ is some vector space. Then we say $S$ is *linearly dependent* if
$$\exists\{a_{i}\}_{i=1}^{n}\subset\textbf{R},\sum_{i=1}^{n}a_{i}v_{i}=\textbf{0}\land\exists i\in\{1,\dots,n\},a_{i}\neq0$$
and we say $S$ is *linearly independent* if it is not linearly dependent
>**NOTE**: $\emptyset$ is always linearly independent
>**NOTE**: $\{\textbf{0}\}$ is always linearly dependent
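For vectors in $\textbf{R}^{n}$, linear independence of $S$ is equivalent to the matrix with the vectors of $S$ as columns having rank $|S|$. A quick numerical sketch (the example vectors are arbitrary):

```python
import numpy as np

def is_independent(vectors):
    """Independent iff the column matrix has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
print(is_independent([v1, v2]))            # True
print(is_independent([v1, v2, v1 + v2]))   # False: v1 + v2 is a combination
```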
**Theorem**: Let $S=\{v_{1},\dots,v_{n}\}$ be a subset of a vector space $V$. If $S$ is linearly dependent, then
$$\exists v\in S,\text{span}(S-\{v\})=\text{span}(S)$$
and if $S$ is linearly independent, then
$$\forall S'\subsetneq S,\text{span}(S')\subsetneq\text{span}(S)$$
*Proof*
First, we prove the first proposition. By the definition of linear dependence,
$$\exists\{a_{i}\}_{i=1}^{n}\subset\textbf{R},\sum_{i=1}^{n}a_{i}v_{i}=\textbf{0}\land\exists i\in\{1,\dots,n\},a_{i}\neq0$$
Without loss of generality, we assume that $a_{1}\neq0$. Thus
$$v_{1}=-\sum_{i=2}^{n}\frac{a_{i}}{a_{1}}v_{i}=\sum_{i=2}^{n}b_{i}v_{i}$$
where $b_{i}=-a_{i}/a_{1}$. Now, for any scalars $c_{i}\in\textbf{R}$, we have that
$$\sum_{i=1}^{n}c_{i}v_{i}=\sum_{i=2}^{n}(c_{i}+c_{1}b_{i})v_{i}\in\text{span}(S-\{v_{1}\})$$
In other words,
$$v\in\text{span}(S)\implies v\in\text{span}(S-\{v_{1}\})$$
Thus
$$\text{span}(S)\subseteq\text{span}(S-\{v_{1}\})$$
It is easy to show that
$$\text{span}(S-\{v_{1}\})\subseteq\text{span}(S)$$
Thus we conclude that $\text{span}(S)=\text{span}(S-\{v_{1}\})$. We now prove the second proposition, by contradiction. Assume that
$$\exists S'\subsetneq S,\text{span}(S')=\text{span}(S)$$
and, without loss of generality, suppose that $S'=S-\{v_{1}\}$. Since $v_{1}\in\text{span}(S)=\text{span}(S')$, there exist $b_{i}\in\textbf{R}$ such that
$$v_{1}=\sum_{i=2}^{n}b_{i}v_{i}$$
Thus
$$v_{1}-\sum_{i=2}^{n}b_{i}v_{i}=\textbf{0}$$
is a linear combination of the vectors of $S$ equal to $\textbf{0}$ in which the coefficient of $v_{1}$ is $1\neq0$, which implies that $S=\{v_{1},\dots,v_{n}\}$ is linearly dependent. This is a contradiction
$\blacksquare$
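The first part of the theorem can also be observed numerically: dropping a dependent vector leaves the span (the column space) unchanged, which shows up as an unchanged matrix rank. A sketch with arbitrary example vectors:

```python
import numpy as np

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2.0 * v1 - v2                      # v3 depends on v1 and v2

S       = np.column_stack([v1, v2, v3])
S_minus = np.column_stack([v1, v2])     # S - {v3}

# Equal ranks: span(S) = span(S - {v3}) as subspaces of R^3.
assert np.linalg.matrix_rank(S) == np.linalg.matrix_rank(S_minus) == 2
```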
## Expansions
**Complex numbers and rational numbers notation**: The set of complex numbers is denoted by $\textbf{C}$ and the set of rational numbers is denoted by $\textbf{Q}$
**Infinite dimensional vector space** is a vector space containing infinitely many linearly independent vectors
**Summation of sets of vectors**: If $S_{1}$ and $S_{2}$ are nonempty subsets of a vector space $V$, then *the sum of $S_{1}$ and $S_{2}$* is
$$S_{1}+S_{2}=\{x+y:x\in S_{1}\land y\in S_{2}\}$$
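For finite sets the definition can be enumerated directly. A tiny sketch in $\textbf{R}^{2}$ (the sets are arbitrary examples):

```python
import numpy as np
from itertools import product

S1 = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
S2 = [np.array([1.0, 1.0])]

# S1 + S2 = {x + y : x in S1, y in S2}
sum_set = [x + y for x, y in product(S1, S2)]
print(sum_set)   # the two vectors (2, 1) and (1, 2)
```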