---
title: Linear Algebra Note 2
tags: Linear Algebra, 線性代數, 魏群樹, 大學, 國立陽明交通大學, 筆記
---
# Linear Algebra Note 2
## 3. Vector Spaces and Subspaces (2021.10.13 ~ 2021.10.22)
- Define: $\mathbb{R}^n = \{(a_1,\ a_2,\ \cdots,\ a_n) \mid a_i \in \mathbb{R}\}$
(The space $\mathbb{R}^n$ consists of all vectors $\vec{v}$ with $n$ real components)
- Example
$\left[\begin{array}{c c}
1 & 2 \\
\end{array}\right] \in \mathbb{R}^2,\ \left[\begin{array}{c}
1 \\
2 \\
3 \\
\end{array}\right] \in \mathbb{R}^3,\ \left[\begin{array}{c}
2 \\
\pi \\
\end{array}\right] \in \mathbb{R}^2$
- vector operations
- vector addition: $\vec{v} + \vec{w}$ (element-wise addition)
- Example
$\left[\begin{array}{c}
1 \\
2 \\
3 \\
\end{array}\right] + \left[\begin{array}{c c}
2 \\
\pi \\
0 \\
\end{array}\right] = \left[\begin{array}{c c}
1+2 \\
2+\pi \\
3+0 \\
\end{array}\right] = \left[\begin{array}{c c}
3 \\
2+\pi \\
3 \\
\end{array}\right]$
- scalar multiplication: $c\vec{v}$ where $c \in \mathbb{R}$
- Example
$c = 3,\ \vec{v} = \left[\begin{array}{c c}
1 & 2 \\
\end{array}\right] \\
c\vec{v} = 3\left[\begin{array}{c c}
1 & 2 \\
\end{array}\right] = \left[\begin{array}{c c}
3 & 6 \\
\end{array}\right]$
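
The two operations can be mirrored directly with arrays; a minimal NumPy sketch (the library choice is mine, not the lecture's):
```python
# Element-wise addition and scalar multiplication of vectors, as in the examples above.
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([2.0, np.pi, 0.0])

print(v + w)                     # [3.  5.14159265  3.]  i.e. [1+2, 2+pi, 3+0]
print(3 * np.array([1.0, 2.0]))  # [3. 6.]  i.e. 3 * [1, 2]
```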
### Vector Space
- Define: A vector space $\mathbb{V}$ is a collection of vectors with two operations: (1) vector addition (2) scalar multiplication, so that
(1) $\forall \vec{v},\ \vec{w} \in \mathbb{V},\ \vec{v}+\vec{w} \in \mathbb{V}$
(2) $\forall c \in \mathbb{R},\ \vec{v} \in \mathbb{V},\ c\vec{v} \in \mathbb{V}$
> Any linear combination of vectors in $\mathbb{R}^n$ results in a vector in $\mathbb{R}^n$
### Subspace
- Define: A subset $\mathbb{W}$ of a vector space $\mathbb{V}$ is called a subspace if $\mathbb{W}$ itself forms a vector space under the two operations defined on $\mathbb{V}$ and contains $\vec{O}$
> Since $\mathbb{V}$ is already known to be a vector space, $\mathbb{W}$ inherits the 8 vector space axioms $\implies$ it suffices to verify the closure of $\mathbb{W}$
- Theorem
Let $\mathbb{V}$ be a vector space. A subset $\mathbb{W} \subseteq \mathbb{V}$ is a subspace of $\mathbb{V}$ if and only if
1. $\vec{O} \in \mathbb{W}$ (the same $\vec{O}$ in $\mathbb{V}$)
2. $\vec{x}+\vec{y} \in \mathbb{W}\ \text{for}\ \vec{x},\ \vec{y} \in \mathbb{W}$ (closed under addition)
3. $c\vec{x} \in \mathbb{W}\ \text{for any}\ c \in \mathbb{R},\ \text{any}\ \vec{x} \in \mathbb{W}$ (closed under scalar multiplication)
- Example1
Let $\mathbb{V}$ be a vector space; then $\mathbb{V}$ itself is a subspace of $\mathbb{V}$:
1. $\vec{O} \in \mathbb{V}$
2. $\vec{x}+\vec{y} \in \mathbb{V}\ \text{for}\ \vec{x},\ \vec{y} \in \mathbb{V}$
3. $c\vec{x} \in \mathbb{V}\ \text{for any}\ c \in \mathbb{R},\ \text{any}\ \vec{x} \in \mathbb{V}$
- Example2
$\mathbb{Z} = \{\vec{O}\}$ is a subspace of $\mathbb{V}$
1. $\vec{O} \in \mathbb{Z}$
2. $\vec{O} + \vec{O} = \vec{O} \in \mathbb{Z}$
3. $c\vec{O} = \vec{O} \in \mathbb{Z}$
- Example3
$\mathbb{M}$: The matrix space of all real $2 \times 2$ matrices
$\mathbb{M} = \{ \left[\begin{array}{c c}
a & b \\
c & d \\
\end{array}\right] \mid a,\ b,\ c,\ d \in \mathbb{R} \} \\
\text{Let } \mathbb{D} = \{ \left[\begin{array}{c c}
e & 0 \\
0 & f \\
\end{array}\right] \mid e,\ f \in \mathbb{R} \}$
The collection of all $2 \times 2$ diagonal matrices $\mathbb{D} \subseteq \mathbb{M}$ is a subspace of $\mathbb{M}$:
1. $\left[\begin{array}{c c}
0 & 0 \\
0 & 0 \\
\end{array}\right] \in \mathbb{D}$
2. $\left[\begin{array}{c c}
e_1 & 0 \\
0 & f_1 \\
\end{array}\right] + \left[\begin{array}{c c}
e_2 & 0 \\
0 & f_2 \\
\end{array}\right] \in \mathbb{D}$
3. $\left[\begin{array}{c c}
ce_1 & 0 \\
0 & cf_1 \\
\end{array}\right] \in \mathbb{D}$
- Example4
$\mathbb{R}^3$ is a vector space; consider $\mathbb{W}_1 = \{[a_1,\ a_2,\ a_3] \in \mathbb{R}^3 \mid a_1 = 3a_2,\ a_3 = a_2\} \\
\begin{split}\forall \vec{x},\ \vec{y} \in \mathbb{W_1},\ \vec{x} &= [x_1,\ x_2,\ x_3],\ x_1 = 3x_2,\ x_3 = x_2 \\
\vec{y} &= [y_1,\ y_2,\ y_3],\ y_1 = 3y_2,\ y_3 = y_2\end{split} \\
\text{1. } [0,\ 0,\ 0] \in \mathbb{W_1} \\
\begin{split}\text{2. } \vec{x}+\vec{y} &= [x_1+y_1,\ x_2+y_2,\ x_3+y_3] \\
&= [3x_2+3y_2,\ x_2+y_2,\ x_2+y_2] \in \mathbb{W_1}\end{split} \\
\text{3. } c\vec{x} = [cx_1,\ cx_2,\ cx_3] = [c(3x_2),\ cx_2,\ cx_2] \in \mathbb{W_1} \\
\implies \mathbb{W_1} \text{ is a subspace of } \mathbb{R}^3 \\
\mathbb{W_2} = \{[a_1,\ a_2,\ a_3] \in \mathbb{R}^3 \mid a_1 = a_3 + 2\} \\
[0,\ 0,\ 0] \notin \mathbb{W_2} \implies \mathbb{W_2} \text{ is not a subspace of } \mathbb{R}^3$
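
As a numerical sanity check (not a proof), a small NumPy sketch that spot-checks the three conditions for $\mathbb{W}_1$ on sample vectors; NumPy is an assumed tool choice:
```python
# Spot-check the three subspace conditions for
# W1 = {(a1, a2, a3) in R^3 : a1 = 3*a2, a3 = a2} on sample vectors.
import numpy as np

def in_W1(a):
    a1, a2, a3 = a
    return bool(np.isclose(a1, 3 * a2) and np.isclose(a3, a2))

x = np.array([3.0, 1.0, 1.0])      # element of W1 with a2 = 1
y = np.array([-6.0, -2.0, -2.0])   # element of W1 with a2 = -2

print(in_W1(np.zeros(3)))  # 1. zero vector is in W1          -> True
print(in_W1(x + y))        # 2. closed under addition         -> True
print(in_W1(5 * x))        # 3. closed under scalar multiple  -> True
```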
### Four Fundamental Subspaces
1. Column space of $A_{m \times n}$: $C(A) \triangleq \{A\vec{x} \mid \vec{x} \in \mathbb{R}^n\}$
2. Row space of $A_{m \times n}$: $C(A^T) \triangleq \{A^T\vec{y} \mid \vec{y} \in \mathbb{R}^m\}$
3. Null space of $A$: $N(A) \triangleq \{\vec{x} \in \mathbb{R}^n \mid A\vec{x} = \vec{O}\}$
4. Left Null space of $A$: $N(A^T) \triangleq \{\vec{y} \in \mathbb{R}^m \mid A^T\vec{y} = \vec{O}\}\ (A^T\vec{y} = \vec{O} \iff \vec{y}^TA = \vec{O}^T)$
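
A short SymPy sketch (the library is an assumption; the sample matrix is the one reused in Example 3 of the Null Space section below) that returns a basis for each of the four subspaces:
```python
# Bases for the four fundamental subspaces of a sample 3x5 matrix (m = 3, n = 5).
from sympy import Matrix

A = Matrix([[1, 3, 0, 2, -1],
            [0, 0, 1, 4, -3],
            [1, 3, 1, 6, -4]])

print(A.columnspace())     # basis of C(A)    -- a subspace of R^m
print(A.T.columnspace())   # basis of C(A^T)  -- a subspace of R^n (row space)
print(A.nullspace())       # basis of N(A)    -- a subspace of R^n
print(A.T.nullspace())     # basis of N(A^T)  -- a subspace of R^m (left null space)
```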
### Column Space
$C(A)$ (the column space of $A$) is defined as the set of all linear combinations of the columns of $A$
$\text{Let }A = \left[\begin{array}{c c}
1 & 0 \\
4 & 3 \\
2 & 3 \\
\end{array}\right],\ C(A) = \{A\vec{x} \mid \vec{x} \in \mathbb{R}^2\} = \{x_1\left[\begin{array}{c c}
1 \\
4 \\
2 \\
\end{array}\right]+x_2\left[\begin{array}{c c}
0 \\
3 \\
3 \\
\end{array}\right] \mid x_1,\ x_2 \in \mathbb{R}\}$
Hence $C(A)$ is the collection of all $\vec{b}$ for which $A\vec{x} = \vec{b}$ is solvable, i.e. $A\vec{x} = \vec{b}$ has a solution if and only if $\vec{b} \in C(A)$ (see the sketch below)
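
That membership test can be phrased with ranks: $\vec{b} \in C(A)$ exactly when appending $\vec{b}$ as an extra column does not increase the rank. A small NumPy sketch (an assumed tool) using the $3 \times 2$ matrix above:
```python
# A x = b is solvable  <=>  b in C(A)  <=>  rank([A | b]) == rank(A).
import numpy as np

A = np.array([[1, 0],
              [4, 3],
              [2, 3]], dtype=float)

def in_column_space(A, b):
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

print(in_column_space(A, A @ np.array([2.0, -1.0])))  # True: b was built as A x
print(in_column_space(A, np.array([1.0, 0.0, 0.0])))  # False: not a combination of the columns
```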
- Example1
$A = I_{2 \times 2} = \left[\begin{array}{c c}
1 & 0 \\
0 & 1 \\
\end{array}\right] \\
C(A) = \{A\vec{x} \mid \vec{x} \in \mathbb{R}^2\} = \{x_1\left[\begin{array}{c c}
1 \\
0 \\
\end{array}\right]+x_2\left[\begin{array}{c c}
0 \\
1 \\
\end{array}\right] \mid x_1,\ x_2 \in \mathbb{R}\}$
- Example2
$A = \left[\begin{array}{c c}
1 & 2 \\
2 & 4 \\
\end{array}\right] \\
\begin{split}C(A) = \{A\vec{x} \mid \vec{x} \in \mathbb{R}^2\} = &\{x_1\left[\begin{array}{c c}
1 \\
2 \\
\end{array}\right]+x_2\left[\begin{array}{c c}
2 \\
4 \\
\end{array}\right] \mid x_1,\ x_2 \in \mathbb{R}\} \\
= &\{(x_1+2x_2)\left[\begin{array}{c c}
1 \\
2 \\
\end{array}\right] \mid x_1,\ x_2 \in \mathbb{R}\} \\
\text{Let }x_1+2x_2=y \implies &\{y\left[\begin{array}{c c}
1 \\
2 \\
\end{array}\right] \mid y \in \mathbb{R}\} \\
\implies &\ C(A) \text{ is a line in } \mathbb{R}^2 \text{, so } A\vec{x}=\vec{b} \text{ is solvable only for } \vec{b} \text{ on this line}\end{split}$
- Example3
$A = \left[\begin{array}{c c c}
1 & 2 & 3 \\
0 & 0 & 4 \\
\end{array}\right]_{2 \times 3} \\
\begin{split}C(A) &= \{x_1\left[\begin{array}{c c}
1 \\
0 \\
\end{array}\right]+x_2\left[\begin{array}{c c}
2 \\
0 \\
\end{array}\right]+x_3\left[\begin{array}{c c}
3 \\
4 \\
\end{array}\right] \mid x_1,\ x_2,\ x_3 \in \mathbb{R}\} \\
&= \{x_1\left[\begin{array}{c c}
1 \\
0 \\
\end{array}\right]+2x_2\left[\begin{array}{c c}
1 \\
0 \\
\end{array}\right]+x_3\left[\begin{array}{c c}
3 \\
4 \\
\end{array}\right] \mid x_1,\ x_2,\ x_3 \in \mathbb{R}\} \\
&= \{y\left[\begin{array}{c c}
1 \\
0 \\
\end{array}\right]+x_3\left[\begin{array}{c c}
3 \\
4 \\
\end{array}\right] \mid y,\ x_3 \in \mathbb{R}\}\ \text{where } y = x_1+2x_2\end{split}$
- Theorem
$C(A)$ is a subspace of $\mathbb{R}^m$
- Proof
1. $A\vec{O}_{n \times 1} = \vec{O}_{m \times 1} \implies \vec{O}_{m \times 1} \in C(A),\ \vec{O}_{m \times 1} \in \mathbb{R}^m$
2. $\forall \vec{x},\ \vec{y} \in C(A),\ \text{we have } \vec{x}_{m \times 1} = A\vec{x'}_{n \times 1} \text{ and } \vec{y}_{m \times 1} = A\vec{y'}_{n \times 1} \text{ where } \vec{x'},\ \vec{y'} \in \mathbb{R}^n \\
\vec{x} + \vec{y} = A\vec{x'} + A\vec{y'} = A(\vec{x'}+\vec{y'}) = A\vec{z'} \text{ where } \vec{z'} = \vec{x'} + \vec{y'} \in \mathbb{R}^n \implies \vec{x} + \vec{y} \in C(A)$
3. $\forall c \in \mathbb{R} \\
c\vec{x} = cA\vec{x'} = A(c\vec{x'}) = A\vec{z''} \text{ where } \vec{z''} = c\vec{x'} \in \mathbb{R}^n \implies c\vec{x} \in C(A)$
### Null Space
- Define: The null space of $A_{m \times n}$ consists of all solutions of $A\vec{x} = \vec{O}$, denoted by $N(A)$ , i.e. $N(A) \triangleq \{\vec{x} \in \mathbb{R}^n \mid A\vec{x} = \vec{O}\}$
- Theorem
$N(A)$ is a subspace of $\mathbb{R}^n$
- Proof
1. $\text{Let } \vec{x} = \vec{O} \\
A\vec{O} = \vec{O} \implies \vec{O} \in N(A)$
2. $\forall \vec{x},\ \vec{y} \in N(A),\ A\vec{x} = \vec{O},\ A\vec{y} = \vec{O} \\
A(\vec{x} + \vec{y}) = A\vec{x} + A\vec{y} = \vec{O} + \vec{O} = \vec{O} \implies \vec{x} + \vec{y} \in N(A)$
3. $\forall c \in \mathbb{R} \\
A(c\vec{x}) = cA\vec{x} = \vec{O} \implies c\vec{x} \in N(A)$
- Example1
$A = \left[\begin{array}{c c}
1 & 2 \\
3 & 6 \\
\end{array}\right]$
Solve $A\vec{x} = \vec{O}$
$\begin{split}\left[\begin{array}{c|c}
A & \vec{O} \\
\end{array}\right] = &\left[\begin{array}{c c|c}
1 & 2 & 0 \\
3 & 6 & 0 \\
\end{array}\right] \\
\to &\left[\begin{array}{c c|c}
1 & 2 & 0 \\
0 & 0 & 0 \\
\end{array}\right] \\
\implies &\ x_1 + 2x_2 = 0 \\
\implies &\ x_1 = -2x_2 \\
\implies &N(A) = \{c\left[\begin{array}{c}
-2 \\
1 \\
\end{array}\right] \mid c \in \mathbb{R}\}\end{split}$
($N(A)$ is a line consisting of all solutions to $A\vec{x} = \vec{O}$)
- Example2
$A = \left[\begin{array}{c c}
1 & 2 \\
3 & 8 \\
\end{array}\right]_{2 \times 2},\ B = \left[\begin{array}{c c}
1 & 2 \\
3 & 8 \\
2 & 4 \\
6 & 10 \\
\end{array}\right]_{4 \times 2},\ C = \left[\begin{array}{c c c c}
1 & 2 & 2 & 4 \\
3 & 8 & 6 & 16 \\
\end{array}\right]_{2 \times 4} \\
A = \left[\begin{array}{c c}
1 & 2 \\
3 & 8 \\
\end{array}\right] \to \left[\begin{array}{c c}
1 & 0 \\
0 & 1 \\
\end{array}\right] \implies \left.\begin{array}{c}
x_1 = 0 \\
x_2 = 0 \\
\end{array}\right. \\
\implies \text{The only solution to } A\vec{x} = \vec{O} \text{ is } \vec{x} = \vec{O} \text{ i.e. } N(A) = \{ \vec{O} \} \\
B = \left[\begin{array}{c c}
1 & 2 \\
3 & 8 \\
2 & 4 \\
6 & 10 \\
\end{array}\right] \to \left[\begin{array}{c c}
1 & 0 \\
0 & 1 \\
0 & 0 \\
0 & 0 \\
\end{array}\right] \implies \left.\begin{array}{c}
x_1 = 0 \\
x_2 = 0 \\
\end{array}\right. \\
\implies \text{Again, the only solution to } B\vec{x} = \vec{O} \text{ is } \vec{x} = \vec{O} \text{ i.e. } N(B) = \{ \vec{O} \} \\
C = \left[\begin{array}{c c c c}
1 & 2 & 2 & 4 \\
3 & 8 & 6 & 16 \\
\end{array}\right] \to \left[\begin{array}{c c c c}
\color{red}{1} & 0 & 2 & 0 \\
0 & \color{red}{1} & 0 & 2 \\
\end{array}\right] \implies \left.\begin{array}{c}
x_1 + 2x_3 = 0 \\
x_2 + 2x_4 = 0 \\
\end{array}\right. \implies \left.\begin{array}{c}
x_1 = -2x_3 \\
x_2 = -2x_4 \\
\end{array}\right. \\
\begin{split}\implies N(C) &= \{\left[\begin{array}{c}
x_1 \\
x_2 \\
x_3 \\
x_4 \\
\end{array}\right] \mid x_1 = -2x_3,\ x_2 = -2x_4,\ x_1,x_2,x_3,x_4 \in \mathbb{R}\} \\
&= \{\left[\begin{array}{c}
-2x_3 \\
-2x_4 \\
x_3 \\
x_4 \\
\end{array}\right] \mid x_3,x_4 \in \mathbb{R}\} \\
&= \{x_3\left[\begin{array}{c}
-2 \\
0 \\
1 \\
0 \\
\end{array}\right] + x_4\left[\begin{array}{c}
0 \\
-2 \\
0 \\
1 \\
\end{array}\right] \mid x_3,x_4 \in \mathbb{R}\}\end{split}$
i.e. $N(C)$ consists of all linear combinations of $\left[\begin{array}{c}
-2 \\
0 \\
1 \\
0 \\
\end{array}\right],\ \left[\begin{array}{c}
0 \\
-2 \\
0 \\
1 \\
\end{array}\right]$
> Since $N(C)$ is a subspace, any linear combination of special solutions ($\left[\begin{array}{c}
-2 \\
0 \\
1 \\
0 \\
\end{array}\right],\ \left[\begin{array}{c}
0 \\
-2 \\
0 \\
1 \\
\end{array}\right]$) lies in $N(C)$
> Variables corresponding to pivot columns → pivot variables, e.g. $x_1$, $x_2$
> Variables corresponding to free columns → free variables, e.g. $x_3$, $x_4$
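
The pivot columns and special solutions of $C$ can be reproduced with SymPy (an assumed tool): `rref()` exposes the pivot columns and `nullspace()` returns one basis vector per free variable.
```python
# Example 2's matrix C: RREF, pivot columns, and the two special solutions.
from sympy import Matrix

C = Matrix([[1, 2, 2, 4],
            [3, 8, 6, 16]])

R, pivots = C.rref()
print(R)              # rows [1, 0, 2, 0] and [0, 1, 0, 2]
print(pivots)         # (0, 1) -> pivot variables x1, x2; free variables x3, x4
print(C.nullspace())  # the special solutions (-2, 0, 1, 0) and (0, -2, 0, 1), as columns
```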
- Example3
$A = \left[\begin{array}{c c c c c}
1 & 3 & 0 & 2 & -1 \\
0 & 0 & 1 & 4 & -3 \\
1 & 3 & 1 & 6 & -4 \\
\end{array}\right] \to R = \left[\begin{array}{c c c c c}
1 & 3 & 0 & 2 & -1 \\
0 & 0 & 1 & 4 & -3 \\
0 & 0 & 0 & 0 & 0 \\
\end{array}\right] \implies \left.\begin{array}{c}
x_1 + 3x_2 + 2x_4 - x_5 = 0 \\
x_3 + 4x_4 - 3x_5 = 0 \\
\end{array}\right. \\
\begin{split}\implies N(A) = N(R) &= \{\left[\begin{array}{c}
x_1 \\
x_2 \\
x_3 \\
x_4 \\
x_5 \\
\end{array}\right] \mid \left.\begin{array}{c}
x_1 + 3x_2 + 2x_4 - x_5 = 0 \\
x_3 + 4x_4 - 3x_5 = 0 \\
\end{array}\right.,\ x_1,x_2,x_3,x_4,x_5 \in \mathbb{R}\} \\
&= \{\left[\begin{array}{c}
-3x_2 - 2x_4 + x_5 \\
x_2 \\
-4x_4 + 3x_5 \\
x_4 \\
x_5 \\
\end{array}\right] \mid x_2,x_4,x_5 \in \mathbb{R}\} \\
&= \{x_2\left[\begin{array}{c}
-3 \\
1 \\
0 \\
0 \\
0 \\
\end{array}\right] + x_4\left[\begin{array}{c}
-2 \\
0 \\
-4 \\
1 \\
0 \\
\end{array}\right] + x_5\left[\begin{array}{c}
1 \\
0 \\
3 \\
0 \\
1 \\
\end{array}\right] \mid x_2,x_4,x_5 \in \mathbb{R}\}\end{split}$
Let $N = \left[\begin{array}{c c c}
-3 & -2 & 1 \\
1 & 0 & 0 \\
0 & -4 & 3 \\
0 & 1 & 0 \\
0 & 0 & 1 \\
\end{array}\right]$, then $span(N)$, the set of all linear combinations of the column vectors of $N$, equals $N(A) = N(R)$
Rearrange $N$ (null space matrix) $\to N' = \left[\begin{array}{c c c}
-3 & -2 & 1 \\
0 & -4 & 3 \\
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1 \\
\end{array}\right] = \left[\begin{array}{c}
-F_{2 \times 3} \\
I_{3 \times 3} \\
\end{array}\right] \left.\begin{array}{c}
r \text{ pivot variables} \\
n-r \text{ free variables} \\
\end{array}\right.$
Rearrange $R \to R' = \left[\begin{array}{c c c c c}
1 & 0 & 3 & 2 & -1 \\
0 & 1 & 0 & 4 & -3 \\
0 & 0 & 0 & 0 & 0 \\
\end{array}\right] = \left[\begin{array}{c c}
I_{2 \times 2} & F_{2 \times 3} \\
O_{1 \times 2} & O_{1 \times 3} \\
\end{array}\right] \left.\begin{array}{c}
r \text{ pivot rows} \\
m-r \text{ zero rows} \\
\end{array}\right.$
Check $R'N' = \left[\begin{array}{c c}
I_{2 \times 2} & F_{2 \times 3} \\
O_{1 \times 2} & O_{1 \times 3} \\
\end{array}\right]\left[\begin{array}{c}
-F_{2 \times 3} \\
I_{3 \times 3} \\
\end{array}\right] = \left[\begin{array}{c}
-F_{2 \times 3}+F_{2 \times 3} \\
O_{1 \times 3}+O_{1 \times 3} \\
\end{array}\right] = O_{3 \times 3}$
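
The block identity $R'N' = O$ can also be checked numerically; a small NumPy sketch (an assumed tool) with $F$ read off from $R$:
```python
# Verify R' N' = O with F = [[3, 2, -1], [0, 4, -3]] (the free-variable coefficients).
import numpy as np

F = np.array([[3.0, 2.0, -1.0],
              [0.0, 4.0, -3.0]])

R_prime = np.block([[np.eye(2),        F],
                    [np.zeros((1, 2)), np.zeros((1, 3))]])   # [I F; O O], 3x5
N_prime = np.block([[-F],
                    [np.eye(3)]])                            # [-F; I], 5x3

print(R_prime @ N_prime)   # the 3x3 zero matrix
```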
- A general approach to find $N(A)$
1. Identify special solutions
2. Take the span of these special solutions to get $N(A)$
- Summary: To find $N(A)$ of $A$
1. Use Gauss-Jordan Elimination $A \to R$ (reduced row echelon form)
2. Identify pivot variables and free variables
3. Identify special solutions (the number of special solutions is equal to the number of free variables)
4. $span(\{ \text{special solutions} \}) = N(A)$
Note that in step 2, if there are no free variables, then $N(A) = \{ \vec{O} \}$
- Define: The dimension of the null space is the number of free variables,
i.e. $dim(N(A)) = \text{the number of special solutions} = \text{the number of free variables}$
- Define: The rank of $A_{m \times n}$ , $r = rank(A) = \text{the number of pivots}$
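
The summary maps directly onto SymPy calls (an assumed tool, sketched for the Example 3 matrix): `rref()` covers steps 1–2, `nullspace()` gives the special solutions of step 3, and their count and the rank match the two definitions above.
```python
# Steps of the summary: RREF -> pivot/free variables -> special solutions -> N(A).
from sympy import Matrix

A = Matrix([[1, 3, 0, 2, -1],
            [0, 0, 1, 4, -3],
            [1, 3, 1, 6, -4]])

R, pivot_cols = A.rref()                                        # step 1
free_cols = [j for j in range(A.cols) if j not in pivot_cols]   # step 2
special = A.nullspace()                                         # step 3: one per free variable

print(pivot_cols, free_cols)   # (0, 2) and [1, 3, 4]
print(len(special))            # dim(N(A)) = number of free variables = 3
print(A.rank())                # r = number of pivots = 2

# No free variables -> N(A) = {0}, e.g. the invertible matrix from Example 2:
print(Matrix([[1, 2], [3, 8]]).nullspace())   # []
```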
### Span
- Define: We define the span of a set of vectors $\mathbb{U}$ to be the set of all linear combinations of vectors in $\mathbb{U}$, denoted by $span(\mathbb{U})$
- $span(\mathbb{U})$ is always a subspace of the ambient vector space $\mathbb{V}$ (e.g. $\mathbb{R}^m$)
- Example
$\mathbb{U} = \{\vec{u_1},\ \vec{u_2},\ \cdots,\ \vec{u_n} \} \subseteq \mathbb{V}$, i.e. $\vec{u_i} \in \mathbb{V}$ as well
$span(\mathbb{U}) = \{c_1\vec{u_1}+c_2\vec{u_2}+\cdots+c_n\vec{u_n} \mid c_1,\ c_2,\ \cdots,\ c_n \in \mathbb{R} \}$
- $C(A) = span(\mathbb{U})$ where $\mathbb{U} = \{ \text{column vectors of } A\}$
### The Complete Solution to $A\vec{x} = \vec{b}$
- Define: For $A\vec{x} = \vec{b}$: $\left\{\begin{array}{l l}
\vec{b} = \vec{0} & \text{Homogeneous linear equation} \\
\vec{b} \not= \vec{0} & \text{Non-homogeneous linear equation} \\
\end{array}\right.$
- Example
$A\vec{x} = \vec{b},\ A = \left[\begin{array}{c c c c}
1 & 3 & 0 & 2 \\
0 & 0 & 1 & 4 \\
1 & 3 & 1 & 6 \\
\end{array}\right],\ \vec{b} = \left[\begin{array}{c}
b_1 \\
b_2 \\
b_3 \\
\end{array}\right] \\
\left[\begin{array}{c|c}
A & \vec{b} \\
\end{array}\right] = \left[\begin{array}{c c c c|c}
1 & 3 & 0 & 2 & b_1 \\
0 & 0 & 1 & 4 & b_2 \\
1 & 3 & 1 & 6 & b_3 \\
\end{array}\right] \to \left[\begin{array}{c c c c|c}
1 & 3 & 0 & 2 & b_1 \\
0 & 0 & 1 & 4 & b_2 \\
0 & 0 & 0 & 0 & b_3-b_2-b_1 \\
\end{array}\right]$
$A\vec{x}=\vec{b}$ has solutions if and only if $b_3 - b_2 - b_1 = 0 \implies b_3 = b_1 + b_2$
Assume $b_3 = b_1+b_2,\ \left\{\begin{array}{c}
x_1+3x_2+2x_4 = b_1 \\
x_3+4x_4=b_2 \\
\end{array}\right. \implies \left\{\begin{array}{c}
x_1 = -3x_2-2x_4+b_1 \\
x_3=-4x_4+b_2 \\
\end{array}\right. \\
\begin{split}\vec{x} = \left[\begin{array}{c}
x_1 \\
x_2 \\
x_3 \\
x_4 \\
\end{array}\right] &= \left[\begin{array}{c}
-3x_2-2x_4+b_1 \\
x_2 \\
-4x_4+b_2 \\
x_4 \\
\end{array}\right] \\
&= \underbrace{x_2\left[\begin{array}{c}
-3 \\
1 \\
0 \\
0 \\
\end{array}\right] + x_4\left[\begin{array}{c}
-2 \\
0 \\
-4 \\
1 \\
\end{array}\right]}_{\text{element of }N(A)} + \underbrace{\left[\begin{array}{c}
b_1 \\
0 \\
b_2 \\
0 \\
\end{array}\right]}_\text{particular solution}\end{split}$
1. $\text{when }b_1=b_2=0,\ \vec{b} = \left[\begin{array}{c}
b_1 \\
b_2 \\
b_3 \\
\end{array}\right] = \left[\begin{array}{c}
b_1 \\
b_2 \\
b_1+b_2 \\
\end{array}\right] = \vec{0} \\
\vec{x}_n \triangleq x_2\left[\begin{array}{c}
-3 \\
1 \\
0 \\
0 \\
\end{array}\right] + x_4\left[\begin{array}{c}
-2 \\
0 \\
-4 \\
1 \\
\end{array}\right] \in N(A) \text{ is a solution to } A\vec{x} = \vec{0} \\
\text{In fact } N(A) = \{x_2\left[\begin{array}{c}
-3 \\
1 \\
0 \\
0 \\
\end{array}\right] + x_4\left[\begin{array}{c}
-2 \\
0 \\
-4 \\
1 \\
\end{array}\right] \mid x_2,x_4 \in \mathbb{R}\}$
2. For any choice of $x_2,x_4$, $x_2\left[\begin{array}{c}
-3 \\
1 \\
0 \\
0 \\
\end{array}\right] + x_4\left[\begin{array}{c}
-2 \\
0 \\
-4 \\
1 \\
\end{array}\right] + \left[\begin{array}{c}
b_1 \\
0 \\
b_2 \\
0 \\
\end{array}\right]$ is a solution to $A\vec{x}=\vec{b}$; setting $x_2=x_4=0$ gives $\vec{x}_p \triangleq \left[\begin{array}{c}
b_1 \\
0 \\
b_2 \\
0 \\
\end{array}\right]$ is a particular solution to $A\vec{x}=\vec{b}$
3. The solution space of $A\vec{x}=\vec{b}$ is $\mathbb{K} = \{\vec{x}_p\}+\mathbb{K_H}$ where $\mathbb{K_H}$ is the solution space of the homogeneous equation $A\vec{x}=\vec{0}$, i.e. $N(A)$ (see the sketch below)
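
A brief SymPy sketch (an assumed tool) of points 1–3 for this example, with the concrete choice $b_1 = 1,\ b_2 = 2,\ b_3 = b_1 + b_2 = 3$:
```python
# x_p = [b1, 0, b2, 0] solves A x = b, and so does x_p plus anything in N(A).
from sympy import Matrix

A = Matrix([[1, 3, 0, 2],
            [0, 0, 1, 4],
            [1, 3, 1, 6]])
b = Matrix([1, 2, 3])          # b1 = 1, b2 = 2, b3 = b1 + b2

x_p = Matrix([1, 0, 2, 0])     # particular solution [b1, 0, b2, 0]
print(A * x_p - b)             # zero vector: A x_p = b

for x_n in A.nullspace():      # the special solutions span K_H = N(A)
    print(A * (x_p + 5 * x_n) - b)   # still the zero vector for any multiple added
```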
- Theorem
Let $\mathbb{K}$ be the solution set of $A\vec{x}=\vec{b}$ and $\mathbb{K_H}$ be the solution set of the corresponding homogeneous system $(A\vec{x} = \vec{0})$. Then for any solution $\vec{x}_p$ to $A\vec{x} = \vec{b}$, $\mathbb{K} = \{\vec{x}_p\} + \mathbb{K_H} = \{\vec{x}_p + \vec{x}_n \mid \vec{x}_n \in \mathbb{K_H}\}$
- Proof
Let $\vec{x}_p$ be a solution s.t. $A\vec{x}_p = \vec{b}$
Let $\vec{x'}$ be a solution to $A\vec{x} = \vec{b}$
$A\vec{x'} = \vec{b}$ i.e. $\vec{x'} \in \mathbb{K}$
$A(\vec{x'}-\vec{x}_p) = A\vec{x'}-A\vec{x}_p = \vec{b} - \vec{b} = \vec{0}$
$\therefore \vec{x'} - \vec{x}_p$ is a solution to $A\vec{x}=\vec{0} \implies \vec{x'} - \vec{x}_p \in \mathbb{K_H}$
Let $\vec{x}_n = \vec{x'} - \vec{x}_p \in \mathbb{K_H} \implies \vec{x'} = \vec{x}_p + \vec{x}_n$
$\therefore \mathbb{K} \subseteq \{ \vec{x}_p \} + \mathbb{K_H}\ ...(i)$
Next, let $\vec{x'} \in \{ \vec{x}_p \} + \mathbb{K_H} \implies \vec{x'} = \vec{x}_p + \vec{x}_n$ for some $\vec{x}_n \in \mathbb{K_H}$
$A\vec{x'} = A(\vec{x}_p + \vec{x}_n) = A\vec{x}_p + A\vec{x}_n = \vec{b} + \vec{0} = \vec{b}$
$\therefore \vec{x'}$ is a solution to $A\vec{x} = \vec{b} \implies \vec{x'} \in \mathbb{K} \implies \{ \vec{x}_p \} + \mathbb{K_H} \subseteq \mathbb{K}\ ...(ii)$
From $(i)$ and $(ii)$, $\mathbb{K} = \{ \vec{x}_p \} + \mathbb{K_H}$