---
title: "MIT OCW 18.06SC Linear Algebra - Unit 1: Ax=b and the 4 foundamental subspaces"
---
# <u>MIT OCW 18.06SC Linear Algebra - Unit 1 </u><br><span>Ax=b and the 4 fundamental subspaces</span>
###### tags: `mit_ocw`, `linear_algebra`, `note`
<style>
.markdown-body h1 > span:last-child{
font-size: 18pt;
float: right;
width: 100%;
text-align: right;
border-bottom: 1pt #EEE solid;
padding-bottom: 5pt;
margin-bottom: 5pt
}
.markdown-body h1{
border: none;
}
.markdown-body h2{
border-bottom: double #DC965A80 4px;
display: block;
position: relative;
width: 100%;
line-height: 15pt;
}
.markdown-body h2::after{
--size: 8pt;
content: '';
position: absolute;
right: 0;
width: 0;
height: 0;
border-style: solid;
border-width: 0 var(--size) var(--size) 0;
border-color: transparent khaki transparent transparent;
}
.markdown-body strong{
color: darkseagreen;
text-transform: uppercase;
}
.markdown-body{
tab-size: 2;
}
</style>
$$
\require{enclose}
\require{textmacros}
\newcommand\hlight[1]{\enclose{roundedbox}[mathcolor=gold]{#1}}
\newcommand\bmat[1]{\begin{bmatrix}#1\end{bmatrix}}
\newcommand\inv[1]{#1^{\text{-}1}}
$$
## The Geometry of Linear Equations
A set of linear equations $\begin{cases} 2x - y &= 0 \\ -x + 2y &= 3 \\\end{cases}$ can be transformed into the **MATRIX FORM:**
$$
\begin{array}{cccc}
A & \mathbf{x} & & \mathbf{b} \\
\text{(Coefficients)} & \text{(Unknowns)} & & \\
\begin{bmatrix}
2 & -1 \\ -1 & 2
\end{bmatrix} &
\begin{bmatrix}
x \\ y
\end{bmatrix} & = &
\begin{bmatrix}
0 \\ 3
\end{bmatrix}
\end{array}
$$
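A quick numerical check (a minimal sketch, assuming `numpy` is available):
```python
import numpy as np

# Coefficient matrix and RHS of the system above
A = np.array([[2, -1],
              [-1, 2]])
b = np.array([0, 3])

x = np.linalg.solve(A, b)  # solve Ax = b
print(x)                   # [1. 2.], i.e. x = 1, y = 2
```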
## The Column View
We can use **VECTORS** to describe the above equations as a combination of the columns of $A$:
$$
x\begin{bmatrix}
2 \\ -1
\end{bmatrix}
+y\begin{bmatrix}
-1 \\ 2
\end{bmatrix}=
\begin{bmatrix}
0 \\ 3
\end{bmatrix}
$$
Finding the right amounts of the LHS vectors so that they combine into the RHS vector
is called <u>finding a **LINEAR COMBINATION**.</u>
## Linear Dependency
<b>Q:</b> Do the linear combinations fill every point in the space?
+ 〇
+ The vectors are **LINEARLY INDEPENDENT**
+ ✖
+ The vectors are **LINEARLY DEPENDENT**
+ The matrix containing those vectors is **SINGULAR**
## Vectors
Using the vectors $\mathbf{u}, \mathbf{v}$ and $\mathbf{w}$, we can construct the linear combination: $x_1\mathbf{u}+x_2\mathbf{v}+x_3\mathbf{w}=\mathbf{b}$, where $x_1, x_2$ and $x_3$ are called **scalars.**
An example in $\mathbb{R}^3$ is $\mathbf{u} = \begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix}, \mathbf{v} = \begin{bmatrix} 0 \\ 1 \\ -1 \end{bmatrix}, \mathbf{w} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$.
The set of all linear combinations of some vectors forms a **subspace.**
## Matrices
Putting vectors together into columns, we can create a **matrix**, and the product:
$$
A\mathbf{x}=\begin{bmatrix} 1 & 0 & 0 \\ -1 & 1 & 0 \\ 0 & -1 & 1 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}=\begin{bmatrix} x_1 \\ -x_1+x_2 \\ -x_2+x_3 \end{bmatrix}=\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}=\mathbf{b}
$$
equals the sum of vectors $x_1\mathbf{u}+x_2\mathbf{v}+x_3\mathbf{w}=\mathbf{b}$.
Solving for $\mathbf{x}$ is equivalent to finding the **INVERSE MATRIX** of $A$:
$$
\mathbf{x}=\begin{bmatrix}x_1 \\ x_2 \\ x_3\end{bmatrix}=\begin{bmatrix} b_1 \\ b_1+b_2 \\ b_1+b_2+b_3 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 1 & 1 \end{bmatrix}\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}=A^{-1}\mathbf{b}
$$
:::info
Note that:
+ Not all matrices are invertible.
+ While matrix $A$ represents the transform $\mathbf{x\rightarrow b}$, $A^{-1}$ represents $\mathbf{b\rightarrow x}$.
:::
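We can verify this particular inverse numerically (a minimal NumPy sketch; $\mathbf{b}$ is an arbitrary example vector):
```python
import numpy as np

# The difference matrix A from above; its inverse is the "sum" matrix
A = np.array([[ 1,  0, 0],
              [-1,  1, 0],
              [ 0, -1, 1]])
b = np.array([1, 2, 3])

A_inv = np.linalg.inv(A)
print(A_inv)      # lower triangular matrix of ones
print(A_inv @ b)  # [1. 3. 6.], the cumulative sums b1, b1+b2, b1+b2+b3
```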
## Subspaces
+ A **basis** of $\mathbb{R}^n$ is a set of $n$ independent vectors in $\mathbb{R}^n$
+ A **vector space** is closed under linear combinations
+ A **subspace** is a vector space contained within another vector space
The subspaces of $\mathbb{R}^3$ are:
+ The entire $\mathbb{R}^3$
+ Planes through the origin
+ Lines through the origin
+ The origin
## Elimination
For a coefficient matrix $A$, we repeat the following steps:
1. Choose the element at $(n, n)$ as a **pivot**. If the entry at $(n, n)$ is $0$, swap with a row below it.
2. Subtract multiples of the pivot row from the rows below it so that every entry below the pivot is zero.
to create an **upper triangular matrix** $U$.
$$
\begin{array}{ccc} A & & & & U \\
\begin{bmatrix} \hlight{1} & 2 & 1 \\ 3 & 8 & 1 \\ 0 & 4 & 1 \end{bmatrix} & \Rightarrow &
\begin{bmatrix} \hlight{1} & 2 & 1 \\ 0 & \hlight{2} & -2 \\ 0 & 4 & 1 \end{bmatrix} & \Rightarrow &
\begin{bmatrix} \hlight{1} & 2 & 1 \\ 0 & \hlight{2} & -2 \\ 0 & 0 & \hlight{5} \end{bmatrix}
\end{array}
$$
The above elimination process can be represented by products of **elimination matrices**:
$$\begin{array}{ccc} E_{32} & E_{21} & A & & U \\
\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & -2 & 1 \end{bmatrix} &
\begin{bmatrix} 1 & 0 & 0 \\ -3 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} &
\begin{bmatrix} 1 & 2 & 1 \\ 3 & 8 & 1 \\ 0 & 4 & 1 \end{bmatrix} & = &
\begin{bmatrix} 1 & 2 & 1 \\ 0 & 2 & -2 \\ 0 & 0 & 5 \end{bmatrix} &
\end{array}
$$
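The same elimination can be reproduced numerically; a quick NumPy check that $E_{32}E_{21}A=U$:
```python
import numpy as np

A   = np.array([[1, 2, 1], [3, 8, 1], [0, 4, 1]])
E21 = np.array([[1, 0, 0], [-3, 1, 0], [0, 0, 1]])  # row2 -= 3 * row1
E32 = np.array([[1, 0, 0], [0, 1, 0], [0, -2, 1]])  # row3 -= 2 * row2

U = E32 @ E21 @ A
print(U)  # [[1 2 1], [0 2 -2], [0 0 5]]
```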
## Permutation Matrices
A **permutation matrix** swaps two rows or two columns of a matrix, depending on the side from which it multiplies.
$$
P=\begin{bmatrix}1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0\end{bmatrix}, \begin{cases}PA \text{ — swaps row 2 and row 3 of } A \\ AP \text{ — swaps column 2 and column 3 of } A\end{cases}
$$
The matrix can be constructed by swapping 2 rows (columns) of an identity matrix.
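A quick NumPy demo of the two swap directions (the example matrix is arbitrary):
```python
import numpy as np

P = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 1, 0]])
A = np.arange(1, 10).reshape(3, 3)  # [[1,2,3],[4,5,6],[7,8,9]]

print(P @ A)  # rows 2 and 3 of A swapped
print(A @ P)  # columns 2 and 3 of A swapped
```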
## Multiplying Matrices (AB=C)
### Standard
$$
C_{ij}=\displaystyle\sum^{n}_{k=1}A_{ik}B_{kj}
$$
### Columns
See matrix $A$ as columns $a_i$; each column of $C$ is a combination of the $a_i$.
### Rows
See matrix $B$ as rows $b_i$; each row of $C$ is a combination of the $b_i$.
### Column Times Row
$$
\bmat{2 & 1 \\ 3 & 3 \\ 4 & 5}\bmat{1 & 6 \\ 3 & 2} = \bmat{2 \\ 3 \\ 4}\bmat{1 & 6} + \bmat{1 \\ 3 \\ 5}\bmat{3 & 2}
$$
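This expresses $AB$ as a sum of rank-one (outer product) matrices; a quick NumPy check of the identity above:
```python
import numpy as np

A = np.array([[2, 1], [3, 3], [4, 5]])
B = np.array([[1, 6], [3, 2]])

# Sum over k of (column k of A) times (row k of B)
C = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
print(np.allclose(C, A @ B))  # True
```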
### Blocks
$$
\bmat{A_1 & A_3 \\ A_2 & A_4}\bmat{B_1 & B_3 \\ B_2 & B_4}=\bmat{A_1B_1+A_3B_2 & A_1B_3+A_3B_4 \\ A_2B_1+A_4B_2 & A_2B_3+A_4B_4}
$$
## Inverses
$A^{-1}$ is the **Inverse matrix** of $A$, that satisfies $A^{-1}A=I=AA^{-1}$.
If $A^{-1}$ exists, then $A$ is **INVERTIBLE** and **nonsingular**.
If $A$ is **SINGULAR**, then there exists $\mathbf{x}\neq 0$ such that $A\mathbf{x}=0$. For example,
$A=\bmat{1 & 3 \\ 2 & 6}\Rightarrow \bmat{1 & 3 \\ 2 & 6}\bmat{3 \\ -1}=\bmat{0 \\ 0} \Rightarrow A$ is singular.
## Gauss-Jordan Elimination
For the augmented matrix $\bmat{A & I}$, if we use elimination to produce $\bmat{I & M}$, then $M=A^{-1}$.
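A minimal Python sketch of this procedure (partial pivoting is added for safety; the function name and example matrix are chosen here for illustration):
```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing the augmented matrix [A | I]."""
    n = len(A)
    M = np.hstack([A.astype(float), np.eye(n)])  # [A | I]
    for i in range(n):
        # Partial pivoting: swap in the row with the largest pivot candidate
        p = i + np.argmax(np.abs(M[i:, i]))
        M[[i, p]] = M[[p, i]]
        M[i] /= M[i, i]                  # scale so the pivot equals 1
        for j in range(n):               # clear the pivot column elsewhere
            if j != i:
                M[j] -= M[j, i] * M[i]
    return M[:, n:]                      # right half is now A^{-1}

A = np.array([[1, 3], [2, 7]])
print(gauss_jordan_inverse(A))  # [[ 7. -3.] [-2.  1.]]
```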
## Inverse of Product
Suppose $M=(AB)^{-1}$; we know that
$$
\begin{align}
&&(AB)M& &=& &I \\
\Rightarrow& &\enclose{updiagonalstrike}{\inv{A}A}BM& &=& &\inv{A}I \\
\Rightarrow& &\enclose{updiagonalstrike}{\inv{B}B}M& &=& &\inv{B}\inv{A}I \\
\Rightarrow& &M& &=& &\inv{B}\inv{A}
\end{align}
$$
## Transpose of Product
Suppose $M=(AB)^T$, then $M=B^TA^T$. If $A$ is invertible, then $(\inv{A})^T=\inv{(A^T)}$.
## From EA=U to A=LU
From representing the result of elimination as $E_{i_kj_k}\dots E_{i_2j_2}E_{i_1j_1}A=U$, we can also multiply the equation by the inverses of the elimination matrices, namely
$$
A=\inv{E_{i_1j_1}}\inv{E_{i_2j_2}}\dots \inv{E_{i_kj_k}}U=LU
$$
where $L$ is easier to compute.
+ To compute $\inv{E}$, negate the single non-zero entry in the lower-triangular part.
+ To compute $L$, simply combine the non-zero lower-triangular entries of the inverted elimination matrices.
$$
E_{21}=\bmat{1 & 0 & 0 \\ -2 & 1 & 0 \\ 0 & 0 & 1},
E_{31}=\bmat{1 & 0 & 0 \\ 0 & 1 & 0 \\ -5 & 0 & 1},
E_{32}=\bmat{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & -3 & 1}, \\
E_{32}E_{31}E_{21}A=U
\Rightarrow L = \inv{E_{21}}\inv{E_{31}}\inv{E_{32}}=
\bmat{1 & 0 & 0 \\ 2 & 1 & 0 \\ 0 & 0 & 1}\bmat{1 & 0 & 0 \\ 0 & 1 & 0 \\ 5 & 0 & 1}\bmat{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 3 & 1}=\bmat{1 & 0 & 0 \\ 2 & 1 & 0 \\ 5 & 3 & 1}
$$
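In practice libraries compute this factorization directly; a sketch with SciPy (note SciPy pivots rows for numerical stability, so its $L$ and $U$ may differ from the hand-computed factors):
```python
import numpy as np
from scipy.linalg import lu

A = np.array([[1, 2, 1], [3, 8, 1], [0, 4, 1]], dtype=float)
P, L, U = lu(A)                   # factors A = P @ L @ U
print(np.allclose(A, P @ L @ U))  # True
```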
## Multiplicative Group
When a swap is required during elimination, it can be represented by multiplying by a **Permutation matrix**.
The set of all $n\times n$ permutation matrices forms a **multiplicative group** containing $n!$ matrices. For example, with $n=3$ the group is:
$$
\begin{Bmatrix}
\bmat{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1},
\bmat{0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1},
\bmat{0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0},
\bmat{1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0},
\bmat{0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0},
\bmat{0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0}
\end{Bmatrix}
$$
The permutation matrices satisfy $P^T=\inv{P}$.
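We can enumerate the group and verify $P^T=\inv{P}$ for every member (a NumPy sketch):
```python
import numpy as np
from itertools import permutations

n = 3
# Reordering the rows of I by each permutation yields a permutation matrix
group = [np.eye(n)[list(p)] for p in permutations(range(n))]
print(len(group))  # 3! = 6
print(all(np.allclose(P.T @ P, np.eye(n)) for P in group))  # True: P^T = P^-1
```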
## Symmetric Matrix
Matrices that satisfy $A^T=A$ are called **Symmetric Matrices**.
For any matrix $R$, $R^TR$ is always symmetric:
$$
(R^TR)^T=R^T(R^T)^T=R^TR
$$
## Vector Spaces
If a set of vectors is **Closed** under linear combinations, the set is a **Vector space**.
$$
\forall\ \mathbf{a},\mathbf{b}\in V,\ \forall\ c, d\in\mathbb{R}:\quad c\mathbf{a}+d\mathbf{b}\in V
$$
## Column Space of Matrices
For a matrix $A=\bmat{\mathbf{v}_1 & \mathbf{v}_2 & \dots & \mathbf{v}_n}$, the linear combinations of all $\mathbf{v}_i$ form the **Column Space** of $A$, $C(A)$.
The equation $A\mathbf{x=b}$ is solvable iff $\mathbf{b}\in C(A)$.
## Nullspace of Matrices
For the equation $A\mathbf{x}=0$, the subspace formed by all solutions for $\mathbf{x}$ is the **Nullspace** of $A$, $N(A)$.
## Reduced Row Echelon Form
From the upper triangular form $U$, we can further reduce to a form in which:
+ All pivots are $1$
+ Every other entry in a pivot column is $0$
Then we call it the **Reduced row echelon form**.
$$
\begin{align}
A&=\bmat{\hlight{1} & 2 & 2 & 2 \\ 2 & 4 & 6 & 8 \\ 3 & 6 & 8 & 10}\rightarrow
\bmat{\hlight{1} & 2 & 2 & 2 \\ 0 & 0 & \hlight{2} & 4 \\ 0 & 0 & 2 & 4}\rightarrow
\bmat{\hlight{1} & 2 & 2 & 2 \\ 0 & 0 & \hlight{2} & 4 \\ 0 & 0 & 0 & 0} = U \\
&\rightarrow\bmat{1 & 2 & 0 & -2 \\ 0 & 0 & \hlight{2} & 4 \\ 0 & 0 & 0 & 0}\rightarrow
\bmat{\hlight{1} & 2 & 0 & -2 \\ 0 & 0 & \hlight{1} & 2 \\ 0 & 0 & 0 & 0} = \bmat{I & F \\ 0 & 0} = R = \text{rref}(A)
\end{align}
$$
The **Pivot columns** form an identity matrix $I$, and the **Free columns** form the free matrix $F$.
The number of pivots is the **Rank** of the matrix.
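SymPy can compute $\text{rref}(A)$ and the pivot columns directly; a quick check of the example above (assuming `sympy` is installed):
```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])
R, pivots = A.rref()
print(R)       # the rref computed above
print(pivots)  # (0, 2): the pivot columns, so rank(A) = 2
```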
## Compute the Nullspace
For each variable corresponding to a free column, set it to $1$ and the other free variables to $0$.
Then substituting into $\text{rref}(A)$ we obtain one solution of $A\mathbf{x}=0$, called a **Special solution**. The linear combinations of all special solutions form the **Nullspace**.
$$
\text{rref}(A)=\bmat{1 & 2 & 0 & -2 \\ 0 & 0 & 1 & 2} \Rightarrow c_1\bmat{2 \\ 0 \\ -2 \\ 1}+c_2\bmat{-2 \\ 1 \\ 0 \\ 0}
$$
:::info
if $\text{rref}(A)=\bmat{I & F \\ 0 & 0}$, then the **basis** (see below for the definition) of $N(A)$ is the columns of $\bmat{-F \\ I}$.
:::
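SymPy's `Matrix.nullspace` returns exactly these special solutions; a quick check:
```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])
for v in A.nullspace():  # one special solution per free variable
    print(v.T)           # [-2, 1, 0, 0] and [2, 0, -2, 1]
```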
## Solve Ax=b
To solve the equation, we find a **Particular Solution** and the nullspace.
The particular solution could be obtained by:
1. Reduce the augmented matrix $\bmat{A & \bigm| & \mathbf{b}}$ into $\bmat{R & \bigm| & \mathbf{d}}$, where $R=\text{rref}(A)$
2. Set the variables corresponding to the free columns to zero, and solve for the pivot variables.
3. The obtained $\bmat{x_1, x_2 \dots x_n}^T$ is $\mathbf{x}_p$
The **Complete Solution** is $\mathbf{x}_p+c\mathbf{x}_n$, where $c\mathbf{x}_n$ is the linear combination of special solutions.
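A worked sketch with SymPy, using the running example and an RHS $\mathbf{b}=(1,5,6)$ chosen here so the system is solvable:
```python
from sympy import Matrix, Rational

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])
b = Matrix([1, 5, 6])

# Step 1: reduce the augmented matrix [A | b]
R, pivots = A.row_join(b).rref()
print(R)  # last column is d; setting the free variables to 0 reads off x_p

# Steps 2-3: the particular solution, then verify it
x_p = Matrix([-2, 0, Rational(3, 2), 0])
print(A * x_p == b)  # True; add any combination of special solutions
```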
## Ranks
For an $m\times n$ matrix $A$, the **rank** of $A$ is the number of pivots.
$\text{rank}(A) \leq m, \text{rank}(A) \leq n$
+ If $\text{rank}(A) = n < m$, there are ==0 or 1 solutions==: $\text{rref}(A)=\bmat{I \\ 0}$, and a solution exists only if $\mathbf{b}\in C(A)$.
+ If $\text{rank}(A) = m < n$, there are ==infinitely many solutions==: $\text{rref}(A)=\bmat{I & F}$, there is no restriction on $\mathbf{b}$, and there are $n-m$ free variables.
+ If $\text{rank}(A) = m = n$, there is ==$1$ unique solution== for every $\mathbf{b}\in \mathbb{R}^m$: $N(A)=\{\mathbf{0}\}$, $\inv{A}$ exists, and $\text{rref}(A)=I$.
+ Else ($\text{rank}(A) < m$ and $\text{rank}(A) < n$), there are ==0 or infinitely many solutions==: $\text{rref}(A)=\bmat{I & F \\ 0 & 0}$
## Linear Independence
For vectors $\mathbf{x}_1,\mathbf{x}_2,\dots,\mathbf{x}_n$: if $c_1\mathbf{x}_1+c_2\mathbf{x}_2+\dots+c_n\mathbf{x}_n=0$ only when $c_1=c_2=\dots=c_n=0$, then the vectors are called **Linearly independent**.
## Spanning Space
If vectors $\mathbf{v}_1, \mathbf{v}_2 \dots \mathbf{v}_n$ **Span** a space $S$,
+ The space consists of all linear combinations of the vectors.
+ The space is the smallest space containing the vectors.
## Basis of Vector Space
A **Basis** of a vector space is a set of vectors that
+ are independent
+ span the vector space
## Dimension of a Space
The **Dimension** of a space is the number of vectors in a basis of the space.
For a $m\times n$ matrix $A$,
$$
\begin{align}
\text{rank}(A) &= \text{dim}(C(A)) \\
\text{dim}(N(A))&=n-\text{rank}(A)
\end{align}
$$
## Four Fundamental Subspaces
For an $m\times n$ matrix $A$, they are:
+ **Column Space** $C(A)$
+ **Nullspace** $N(A)$
+ **Row Space** $C(A^T)$ — the linear combinations of the rows of $A$
+ **Left Nullspace** $N(A^T)$ — the space of solutions of $A^T\mathbf{x}=0$
## Basis & Dimension of Fundamental Subspaces
### Column Space $C(A)$
+ $\text{dim}(C(A)) =$ number of pivot columns $= \text{rank}(A)$
+ $\text{basis}(C(A)) =$ the columns of $A$ at the pivot-column positions
### Nullspace $N(A)$
+ $\text{dim}(N(A)) =$ number of free columns $= n-\text{rank}(A)$
+ $\text{basis}(N(A)) =$ the special solutions of $A\mathbf{x}=0$ (each free variable set to $1$ in turn, the others to $0$)
### Row Space $C(A^T)$
+ $\text{dim}(C(A^T)) =$ number of non-zero rows in $\text{rref}(A)$ $= \text{rank}(A)$
+ $\text{basis}(C(A^T)) =$ set of non-zero rows in $\text{rref}(A)$, transposed.
### Left Nullspace $N(A^T)$
+ $\text{dim}(N(A^T)) =$ dimension of the nullspace of $A^T$ $= m-\text{rank}(A)$
+ $\text{basis}(N(A^T)) =$ the rows of $E$ (where $EA=\text{rref}(A)$) that produce the zero rows, transposed.
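All four bases can be read off with SymPy; a quick check on the running example:
```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])  # m = 3, n = 4, rank 2

print(A.columnspace())  # basis of C(A): the pivot columns of A
print(A.nullspace())    # basis of N(A): the special solutions
print(A.rowspace())     # basis of C(A^T): the independent rows
print(A.T.nullspace())  # basis of N(A^T): the left nullspace (dim = 1)
```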
## Intersection and Sum of Spaces
For two subspaces $S$ and $U$ of a vector space, the following always holds:
$$
\text{dim}(S)+\text{dim}(U)=\text{dim}(S\cap U)+\text{dim}(S+U)
$$