---
tags: Linear Algebra
---
{%hackmd hackmd-dark-theme %}
# Linear Algebra Note 2
## Ch. 3 Vector Spaces
### Sec. 3.1 Euclidean Vector Spaces
$V$ is a set on which _"addition"_ & _"scalar multiplication"_ are ==defined==, i.e., the **closure properties** must be satisfied.
#### Closure Properties
1. If $\mathbf{x}\in V$ and $\alpha$ is a scalar, then $\alpha\mathbf{x}\in V$.
2. If $\mathbf{x},\mathbf{y}\in V$, then $\mathbf{x}+\mathbf{y}\in V$.
#### Axioms
If the following eight axioms are satisfied, $V$ is called a **vector space**.
1. $\mathbf{x}+\mathbf{y}=\mathbf{y}+\mathbf{x}\;\forall\mathbf{x},\mathbf{y}\in V$ _(commutative law)_
2. $(\mathbf{x}+\mathbf{y})+\mathbf{z}=\mathbf{x}+(\mathbf{y}+\mathbf{z})\;\forall\mathbf{x},\mathbf{y},\mathbf{z}\in V$ _(associative law for vectors)_
3. $\exists\mathbf{0}\text{ s.t. }\mathbf{x}+\mathbf{0}=\mathbf{x}\;\forall\mathbf{x}\in V$ _(additive identity)_
4. $\forall\mathbf{x}\in V\ \exists(-\mathbf{x})\text{ s.t. }\mathbf{x}+(-\mathbf{x})=\mathbf{0}$ _(additive inverse)_
5. $\alpha(\mathbf{x}+\mathbf{y})=\alpha\mathbf{x}+\alpha\mathbf{y}\;\forall\alpha\in\mathbb{R}\;\forall\mathbf{x},\mathbf{y}\in V$ _(distributive law for vectors)_
6. $(\alpha+\beta)\mathbf{x}=\alpha\mathbf{x}+\beta\mathbf{x}\;\forall\alpha,\beta\in\mathbb{R}\;\forall\mathbf{x}\in V$ _(distributive law for scalars)_
7. $(\alpha\beta)\mathbf{x}=\alpha(\beta\mathbf{x})\;\forall\alpha,\beta\in\mathbb{R}\;\forall\mathbf{x}\in V$ _(associative law for scalars)_
8. $1\mathbf{x}=\mathbf{x}\;\forall\mathbf{x}\in V$ _(scalar identity)_
#### Theorem 3.1.1
If $V$ is a vector space and $\mathbf{x}\in V$, then:
- $0\mathbf{x}=\mathbf{0}$
- $\mathbf{x}+\mathbf{y}=\mathbf{0}\implies\mathbf{y}=-\mathbf{x}$ _(uniqueness of additive inverse)_
- $(-1)\mathbf{x}=-\mathbf{x}$
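A sketch of the first claim, using only the axioms: by axiom 6,
\\[0\mathbf{x}=(0+0)\mathbf{x}=0\mathbf{x}+0\mathbf{x},\\]
and adding $-(0\mathbf{x})$ to both sides gives $\mathbf{0}=0\mathbf{x}$.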
### Sec. 3.2 Subspaces
If $S$ is a non-empty subset of a vector space $V$ and $S$ satisfies the two **closure properties**, then $S$ is called a **subspace** of $V$.
- $S$ is said to be _closed under the operation_ of $V$.
- Each subspace is also a vector space _in its own right_.
- All subspaces other than the _zero subspace_ $\{\mathbf{0}\}$ and $V$ itself are called _proper subspaces_.
#### Null Spaces
Let $A$ be an $m\times n$ matrix. The _**null space**_ of $A$, $N(A)$, is the set of all solutions to the _homogeneous system_ $A\mathbf{x}=\mathbf{0}$.
$N(A)$ is a subspace of $\mathbb{R}^n$, since $A\mathbf{x}=\mathbf{0}\implies A(\alpha\mathbf{x})=\alpha A\mathbf{x}=\mathbf{0}$ and $A\mathbf{x}=A\mathbf{y}=\mathbf{0}\implies A(\mathbf{x}+\mathbf{y})=A\mathbf{x}+A\mathbf{y}=\mathbf{0}$.
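A minimal numerical sketch of $N(A)$ with SymPy, on a small matrix chosen here purely for illustration:
```python
from sympy import Matrix

# An illustrative 2x3 matrix (not from the notes); its rows are dependent.
A = Matrix([[1, 2, 1],
            [2, 4, 2]])

# nullspace() returns a basis for N(A), i.e., for all x with A*x = 0.
basis = A.nullspace()
print(basis)

# Closure check: a linear combination of basis vectors stays in N(A).
x = 3 * basis[0] - 2 * basis[1]
print(A * x)  # the zero vector
```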
#### Span
Let $\mathbf{v}=\{\mathbf{v_1},\mathbf{v_2},\dots,\mathbf{v_n}\}$ be vectors in a vector space $V$. The set of all **linear combinations** of $\mathbf{v}$, i.e., all $\sum\alpha_i\mathbf{v_i}$ where the $\alpha_i$ are scalars, is called the **span** of $\mathbf{v}$, denoted $\operatorname{Span}(\mathbf{v})$.
#### Theorem 3.2.1
If $\mathbf{v}=\{\mathbf{v_1},\mathbf{v_2},\dots,\mathbf{v_n}\}\subset V$, then $\operatorname{Span}(\mathbf{v})$ is a subspace of $V$.
#### Spanning Sets
Let $\mathbf{v}=\{\mathbf{v_1},\mathbf{v_2},\dots,\mathbf{v_n}\}\subset V$. If $\operatorname{Span}(\mathbf{v})=V$, then $V$ is spanned by $\mathbf{v}$ and $\mathbf{v}$ is a ==spanning set== for $V$.
#### Theorem 3.2.2
If $A\mathbf{x}=\mathbf{b}$ is consistent and $\mathbf{x_0}$ is a _particular solution_ to the system, then $\mathbf{y}=\mathbf{x_0}+\mathbf{z}$ is also a solution if and only if $\mathbf{z}\in N(A)$.
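A one-line verification in the theorem's notation:
\\[A\mathbf{y}=A(\mathbf{x_0}+\mathbf{z})=A\mathbf{x_0}+A\mathbf{z}=\mathbf{b}+A\mathbf{z},\\]
so $\mathbf{y}$ is a solution if and only if $A\mathbf{z}=\mathbf{0}$, i.e., $\mathbf{z}\in N(A)$.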
### Sec. 3.3 Linear Indep.
:::success
- If $\mathbf{v}$ spans a vector space $V$ and one of the vectors can be represented as a ==linear combination== of the remaining ones, then the remaining vectors still span $V$.
- One of the vectors can be represented as a linear combination of the remaining ones if and only if there exist $c_i$, _==not all==_ $0$, such that $\sum c_i\mathbf{v_i}=\mathbf{0}$.
:::
$\mathbf{v}$ are **linearly indep.** if $\sum c_i\mathbf{v_i}=\mathbf{0}\implies c_1=c_2=\dots=c_n=0$.
#### Basis
$\mathbf{v}$ form a **basis**, a _minimal spanning set_, of $V$ if and only if $\mathbf{v}$ are linearly indep. and span $V$.
#### Theorem 3.3.1
Let $\mathbf{x_1},\mathbf{x_2},\dots,\mathbf{x_n}$ be vectors in $\mathbb{R}^n$ and $X=(\mathbf{x_1},\mathbf{x_2},\dots,\mathbf{x_n})$. Then $\mathbf{x_1},\mathbf{x_2},\dots,\mathbf{x_n}$ are _linearly dep._ if and only if $X$ is _singular_, i.e., $\det(X)=0$.
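A quick check of Theorem 3.3.1 with NumPy, on vectors chosen here for illustration:
```python
import numpy as np

# Illustrative vectors in R^3; x3 = 2*x2 - x1, so the set is linearly dep.
x1 = np.array([1.0, 2.0, 3.0])
x2 = np.array([4.0, 5.0, 6.0])
x3 = np.array([7.0, 8.0, 9.0])

X = np.column_stack([x1, x2, x3])
print(np.linalg.det(X))  # ~0 up to rounding => X singular => dependent
```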
#### Theorem 3.3.2
If a vector can be written as a linear combination of _linearly indep._ vectors $\mathbf{v_1},\dots,\mathbf{v_n}$, that representation is _unique_: if $\sum a_i\mathbf{v_i}=\sum b_i\mathbf{v_i}$, then $\sum(a_i-b_i)\mathbf{v_i}=\mathbf{0}$, so $a_i=b_i$ for all $i$.
### Sec. 3.4 Basis & Dimension
#### Theorem 3.4.1
If a vector space $V$ has a _spanning set_ of size $n$, then any $m$ vectors in $V$, where $m>n$, are _linearly dep._
#### Corollary 3.4.2
Any two bases of the same vector space have the same size.
#### Dimension
The dimension of a vector space $V$, $\dim(V)$, is the size of any of its bases.
#### Theorem 3.4.3
If $\dim(V)=n>0$, then $n$ vectors span $V$ if and only if they are linearly indep.
#### Theorem 3.4.4
If $\dim(V)=n>0$, then:
1. No set of fewer than $n$ vectors can span $V$.
2. Any set of fewer than $n$ _linearly indep._ vectors can be extended to form a basis for $V$.
3. Any _spanning_ set of more than $n$ vectors can be pared down to a basis for $V$.
### Sec. 3.5 Change of Basis
The standard basis of $\mathbb{R}^2$ is $\{\mathbf{e_1},\mathbf{e_2}\}$. $\forall\mathbf{x}\in\mathbb{R}^2$, $\mathbf{x}=(x_1,x_2)^T=x_1\mathbf{e_1}+x_2\mathbf{e_2}$. The scalars $x_1,x_2$ are the ==coordinates== of $\mathbf{x}$ (w.r.t. the standard basis).
For another basis $\{\mathbf{y},\mathbf{z}\}$ of $\mathbb{R}^2$, $\mathbf{x}=\alpha\mathbf{y}+\beta\mathbf{z}$. The scalars $\alpha,\beta$ are the ==coordinates== of $\mathbf{x}$ w.r.t. the ==ordered basis== $[\mathbf{y},\mathbf{z}]$.
#### Transition Matrix
Let $\mathbf{u_1},\mathbf{u_2}$ be the new basis. $U=(\mathbf{u_1},\mathbf{u_2})$ is the **transition matrix** from the ordered basis $[\mathbf{u_1},\mathbf{u_2}]$ to the standard basis.\\[\mathbf{x}=U\mathbf{c}\\] $U$ must be nonsingular, since its columns form a basis. \\[\mathbf{c}=U^{-1}\mathbf{x}\\]
Assume that the coordinates $\mathbf{c}$ of $\mathbf{x}$ w.r.t. $\{\mathbf{u_1},\mathbf{u_2}\}$ are known. For its coordinates $\mathbf{d}$ w.r.t. another basis $\{\mathbf{v_1},\mathbf{v_2}\}$, we have $U\mathbf{c}=V\mathbf{d}\iff\mathbf{d}=V^{-1}U\mathbf{c}$, where $U=(\mathbf{u_1},\mathbf{u_2}),V=(\mathbf{v_1},\mathbf{v_2})$.
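A minimal sketch of this change of coordinates with NumPy, using bases chosen here for illustration:
```python
import numpy as np

U = np.column_stack([[1.0, 1.0], [1.0, -1.0]])  # columns u1, u2
V = np.column_stack([[2.0, 0.0], [0.0, 3.0]])   # columns v1, v2

c = np.array([2.0, 1.0])   # coordinates of x w.r.t. [u1, u2]
x = U @ c                  # x in standard coordinates
d = np.linalg.solve(V, x)  # d = V^{-1} U c, coordinates w.r.t. [v1, v2]

print(x, V @ d)  # V @ d reproduces x
```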
:::warning
\\[[\mathbf{x}]_V\\]
: The coordinates of $\mathbf{x}$ w.r.t. the ordered basis $V$.
:::
### Sec. 3.6 Row Space & Column Space
The span of the row vectors of a matrix $A$ is called its row space $R(A)$, while the span of its column vectors is called its column space $C(A)$.
#### Theorem 3.6.1
Row-equivalent matrices share the same row space.
#### Rank
The ==rank== of a matrix $A$, $\operatorname{rank}(A)$, is defined as the dimension of its row space $\dim(R(A))$.
:::info
- To determine the rank, we can reduce the matrix to _row echelon form_. The ==non-zero== rows then form a basis for the row space; see the sketch below.
:::
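A sketch of that procedure with SymPy, on a matrix chosen here for illustration:
```python
from sympy import Matrix

# Illustrative matrix; the second row is twice the first.
A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

R, pivots = A.rref()  # reduced row echelon form and pivot columns
print(R)              # the nonzero rows of R form a basis for R(A)
print(A.rank())       # 2, the number of nonzero rows
```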
#### Theorem 3.6.2 Consistency Theorem for Linear System
$A\mathbf{x}=\mathbf{b}$ is _consistent_ if and only if $\mathbf{b}\in C(A)$.
#### Theorem 3.6.3
Let $A$ be an $m\times n$ matrix. Then $A\mathbf{x}=\mathbf{b}$ is _consistent_ for every $\mathbf{b}\in\mathbb{R}^m$ if and only if $C(A)=\mathbb{R}^m$.
$A\mathbf{x}=\mathbf{b}$ has _at most one solution_ for every $\mathbf{b}\in\mathbb{R}^m$ if and only if the column vectors of $A$ are _linearly indep._
#### Corollary 3.6.4
An $n\times n$ matrix is _non-singular_ if and only if its column vectors form a basis for $\mathbb{R}^n$.
#### Nullity
The dimension of the null space of a matrix $A$ is called its ==nullity==.
#### Theorem 3.6.5 The Rank-Nullity Theorem
If $A$ is an $m\times n$ matrix, then its rank plus its nullity equals $n$.
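A quick numerical confirmation with SymPy, on an illustrative $3\times4$ matrix (so $n=4$):
```python
from sympy import Matrix

# Illustrative matrix; the third row is the sum of the first two.
A = Matrix([[1, 0, 2, 1],
            [0, 1, 1, 1],
            [1, 1, 3, 2]])

rank = A.rank()
nullity = len(A.nullspace())          # dim N(A)
print(rank, nullity, rank + nullity)  # 2 2 4; rank + nullity == n
```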
#### Theorem 3.6.6
The dimension of the row space of a matrix equals the dimension of its column space.
## Ch. 4 Linear Transformations
### Sec. 4.1 Definition & Examples
A mapping $L:V\to W$, where $V,W$ are vector spaces, is a ==linear transformation== if $L(\alpha\mathbf{v_1}+\beta\mathbf{v_2})=\alpha L(\mathbf{v_1})+\beta L(\mathbf{v_2})\;\forall\mathbf{v_1},\mathbf{v_2}\in V\;\forall\alpha,\beta\in\mathbb{R}$.
If $W=V$, $L$ is a ==linear operator== on $V$.
1. $L(\mathbf{0_V})=\mathbf{0_W}$
2. $L(\sum\alpha_i\mathbf{v_i})=\sum\alpha_iL(\mathbf{v_i})$
3. $L(-\mathbf{v})=-L(\mathbf{v})$
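For instance, the first property follows from linearity together with Theorem 3.1.1:
\\[L(\mathbf{0_V})=L(0\,\mathbf{0_V})=0\,L(\mathbf{0_V})=\mathbf{0_W}\\]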
#### Identity Operator
\\[\mathcal{I}(\mathbf{v})=\mathbf{v}\quad\forall\mathbf{v}\in V,\\]where $V$ is a vector space.
#### Kernel
\\[\ker(L)=\{\mathbf{v}\in V|L(\mathbf{v})=\mathbf{0_W}\}\\]
> Cf. the null space $N(A)$ of a matrix $A$: for the transformation $\mathbf{x}\mapsto A\mathbf{x}$, the kernel is exactly $N(A)$.
#### Image & Range
\\[L(S)=\{\mathbf{w}\in W|\exists\mathbf{v}\in S,L(\mathbf{v})=\mathbf{w}\}\\], where $S$ is a subspace of $V$, is called the **image** of $S$.
> The image is to a transformation what the range is to a function.

$L(V)$, the image of the entire space, is called the **range** of $L$.
#### Theorem 4.1.1
1. $\ker(L)$ is a subspace of $V$.
2. $L(S)$ is a subspace of $W$ for any subspace $S$ of $V$.
### Sec. 4.2 Matrix Representations of Linear Transformations
If $A$ is an $m\times n$ matrix, then $L_A(\mathbf{x})=A\mathbf{x}$ is a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$; conversely, every such linear transformation is of this form (Theorem 4.2.1).
#### Theorem 4.2.1
For any linear transformation $L:\mathbb{R}^n\to\mathbb{R}^m$, there exists an $m\times n$ matrix $A$ such that $L(\mathbf{x})=A\mathbf{x}$ for all $\mathbf{x}$, where the $j$-th column of $A$ is $\mathbf{a_j}=L(\mathbf{e_j})$.
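A minimal sketch of Theorem 4.2.1 with NumPy, for a hypothetical map $L(x_1,x_2)^T=(x_1+x_2,x_1-x_2)^T$ chosen here for illustration:
```python
import numpy as np

def L(x):
    # A hypothetical linear map R^2 -> R^2 (not from the notes).
    return np.array([x[0] + x[1], x[0] - x[1]])

e1, e2 = np.eye(2)                   # standard basis vectors
A = np.column_stack([L(e1), L(e2)])  # a_j = L(e_j)

x = np.array([3.0, 5.0])
print(L(x), A @ x)  # identical results
```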
#### Theorem 4.2.2 Matrix Representation Theorem
If $E=\{\mathbf{v_1},\dots,\mathbf{v_n}\},F=\{\mathbf{w_1},\dots,\mathbf{w_m}\}$ are ordered bases of vector spaces $V,W$ resp., then for any linear transformation $L:V\to W$, there exists an $m\times n$ matrix $A$ such that $\forall\mathbf{v}\in V,[L(\mathbf{v})]_F=A[\mathbf{v}]_E$, where the $j$-th column of $A$ is $\mathbf{a_j}=[L(\mathbf{v_j})]_F$.
#### Theorem 4.2.3
Let $E=\{\mathbf{v_1},\dots,\mathbf{v_n}\},F=\{\mathbf{w_1},\dots,\mathbf{w_m}\}$ be ordered bases of $\mathbb{R}^n,\mathbb{R}^m$ resp. Then for any linear transformation $L:\mathbb{R}^n\to\mathbb{R}^m$, the matrix representing it is $A=(\mathbf{a_j})$, where $\mathbf{a_j}=B^{-1}L(\mathbf{v_j})$ and $B=(\mathbf{w_1},\dots,\mathbf{w_m})$.
### Sec. 4.3 Similarity
If:
1. $B$ is the matrix representing $L$ w.r.t. $\{\mathbf{u_1},\mathbf{u_2}\}$.
2. $A$ is the matrix representing $L$ w.r.t. $\{\mathbf{e_1},\mathbf{e_2}\}$.
3. $U$ is the transition matrix from $\{\mathbf{u_1},\mathbf{u_2}\}$ to $\{\mathbf{e_1},\mathbf{e_2}\}$.
Then we have $B=U^{-1}AU$.
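A minimal sketch with NumPy, using an operator and basis chosen here for illustration:
```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])                     # L w.r.t. the standard basis
U = np.column_stack([[1.0, 0.0], [1.0, 1.0]])  # columns u1, u2

B = np.linalg.inv(U) @ A @ U  # L w.r.t. [u1, u2]

c = np.array([1.0, 2.0])   # [x]_U
x = U @ c                  # x in standard coordinates
print(U @ (B @ c), A @ x)  # acting in either coordinate system agrees
```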
#### Theorem 4.3.1
Let $E=\{\mathbf{v_1},\dots,\mathbf{v_n}\},F=\{\mathbf{w_1},\dots,\mathbf{w_n}\}$ be two ordered bases of a vector space $V$, $L$ be a linear operator on $V$, and $S$ be the transition matrix from $F$ to $E$.
If $A$ is the matrix representing $L$ w.r.t. $E$ and $B$ is the matrix representing $L$ w.r.t. $F$, then $B=S^{-1}AS$.
#### Similar
Let $A,B$ be $n\times n$ matrices. If there exists a nonsingular matrix $S$ such that $B=S^{-1}AS$, then $B$ is said to be ==similar== to $A$.
{%hackmd @nevikw39/signature %}