# Linear Algebra Collaborative Notes 10/12
S: the set of all (x, y, z) satisfying 2x+5y+7z=0
Previous notes: [09/28](https://hackmd.io/GSl_VKZESuu8ctvSP83IYA), [10/05](https://hackmd.io/R35G-Rl1SmefNgLOPAwMRg)
Next week ([10/19](https://hackmd.io/FcolQoiPSmiRQCQ6MbY6jQ))
HW#1 due on Gradescope 10/19 (11:59am)
## 1.5 Triangular Factors and Row Exchanges
$Ax=b$
$\Rightarrow LUx=b$
$\;\;\;\rightarrow Lc=b$
$\;\;\;\rightarrow Ux=c$
e.g. $Ax=
\left[\begin{array}{rrr}2&4&-2\\4&9&-3\\-2&-3&7\end{array}\right]
\left[\begin{array}{c}u\\v\\w\end{array}\right]=
\left[\begin{array}{r}2\\8\\10\end{array}\right]=b$
$\left[\begin{array}{rrrr}2&4&-2&2\\4&9&-3&8\\-2&-3&7&10\end{array}\right]\longrightarrow
\left[\begin{array}{rrrr}2&4&-2&2\\0&1&1&4\\0&1&5&12\end{array}\right]\longrightarrow
\left[\begin{array}{rrrr}2&4&-2&2\\0&1&1&4\\0&0&4&8\end{array}\right]$
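The two elimination steps above can be checked numerically. A minimal sketch using NumPy (the library is our choice here, not part of the notes):

```python
import numpy as np

# The worked example: A x = b, reduced by forward elimination.
A = np.array([[2.0, 4.0, -2.0],
              [4.0, 9.0, -3.0],
              [-2.0, -3.0, 7.0]])
b = np.array([2.0, 8.0, 10.0])

# Row-reduce the augmented matrix [A | b], mirroring the steps shown above.
M = np.hstack([A, b.reshape(-1, 1)])
M[1] -= 2.0 * M[0]    # multiplier l21 = 2
M[2] -= -1.0 * M[0]   # multiplier l31 = -1
M[2] -= 1.0 * M[1]    # multiplier l32 = 1

U, c = M[:, :3], M[:, 3]
print(U)  # upper triangular with pivots 2, 1, 4
print(c)  # [2, 4, 8]
```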
$E_{i,j}A$
$E_{i,j}$: $(i\text{-th row}) + (-l)\,(j\text{-th row})$, i.e. subtract $l$ times row $j$ from row $i$
$l$: multiplier
$E=\left[\begin{array}{rrr}1&0&0\\\color{orange}{-2}&1&0\\\color{orange}{0}&\color{orange}{0}&1\end{array}\right];
F=\left[\begin{array}{rrr}1&0&0\\\color{orange}{0}&1&0\\\color{orange}{1}&\color{orange}{0}&1\end{array}\right];
G=\left[\begin{array}{rrr}1&0&0\\\color{orange}{0}&1&0\\\color{orange}{0}&\color{orange}{-1}&1\end{array}\right]$
i.e. $GFEAx=\left[\begin{array}{rrr}2&4&-2\\0&1&1\\0&0&4\end{array}\right]
\left[\begin{array}{r}u\\v\\w\end{array}\right]=Ux=c
=\left[\begin{array}{r}2\\4\\8\end{array}\right]=GFEb$
Question: How can we undo the steps of Gaussian Elimination?
$$
GFEA = U \;\Rightarrow\; A = E^{-1}F^{-1}G^{-1}(GFEA) = E^{-1}F^{-1}G^{-1}U = LU\text{, i.e. }A=LU
$$
$$
E^{-1} = \begin{bmatrix}
1 & 0 & 0 \\
+2 & 1 & 0 \\
0 & 0 & 1
\end{bmatrix};
F^{-1} = \begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
-1 & 0 & 1
\end{bmatrix};
G^{-1} = \begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & +1 & 1
\end{bmatrix}
$$
$$
L = E^{-1}F^{-1}G^{-1} = \begin{bmatrix}
1 & 0 & 0 \\
2 & 1 & 0 \\
-1 & 1 & 1
\end{bmatrix}
$$
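The product $L = E^{-1}F^{-1}G^{-1}$ and the factorization $A = LU$ can be verified directly. A sketch with NumPy:

```python
import numpy as np

# Elementary matrices from the example:
# E subtracts 2*row1 from row2, F adds row1 to row3, G subtracts row2 from row3.
E = np.array([[1, 0, 0], [-2, 1, 0], [0, 0, 1]], dtype=float)
F = np.array([[1, 0, 0], [0, 1, 0], [1, 0, 1]], dtype=float)
G = np.array([[1, 0, 0], [0, 1, 0], [0, -1, 1]], dtype=float)

A = np.array([[2, 4, -2], [4, 9, -3], [-2, -3, 7]], dtype=float)

U = G @ F @ E @ A  # forward elimination written as matrix products
L = np.linalg.inv(E) @ np.linalg.inv(F) @ np.linalg.inv(G)
print(L)                      # multipliers 2, -1, 1 below the diagonal
print(np.allclose(A, L @ U))  # True
```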
### Triangular Factorization $A=LU$
If no row exchanges are required, the original matrix $A$ can be written as $A=LU$.
The matrix $L$ is lower triangular, with $1$'s on the diagonal and the multipliers $l_{ij}$ (taken from elimination) below the diagonal.
$U$ is the upper triangular matrix that appears after forward elimination and before back-substitution; its diagonal entries are the pivots!
$$
\text{e.g. }
A = \begin{bmatrix}
2 & -1 & -1 \\
0 & -4 & 2 \\
6 & -3 & 0
\end{bmatrix} =
\begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
3 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
2 & -1 & -1 \\
0 & -4 & 2 \\
0 & 0 & 3
\end{bmatrix} =
\begin{bmatrix}
2 & 0 & 0 \\
0 & -4 & 0 \\
6 & 0 & 3
\end{bmatrix}
\begin{bmatrix}
1 & \frac{-1}{2} & \frac{-1}{2} \\
0 & 1 & \frac{-1}{2} \\
0 & 0 & 1
\end{bmatrix}
$$
#### Note
LU decomposition is $\color{orange}{NOT}$ unique.
One Linear System = Two Triangular Systems
$Ax = b \;\rightarrow\; Lc = b \;\&\; Ux = c$
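The two triangular solves can be written out explicitly. A minimal sketch in NumPy, using the $L$, $U$, $b$ from the example above (the helper names `forward_sub` and `back_sub` are ours):

```python
import numpy as np

def forward_sub(L, b):
    """Solve L c = b for lower triangular L with unit diagonal."""
    n = len(b)
    c = np.zeros(n)
    for i in range(n):
        c[i] = b[i] - L[i, :i] @ c[:i]
    return c

def back_sub(U, c):
    """Solve U x = c for upper triangular U with nonzero pivots."""
    n = len(c)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

L = np.array([[1, 0, 0], [2, 1, 0], [-1, 1, 1]], dtype=float)
U = np.array([[2, 4, -2], [0, 1, 1], [0, 0, 4]], dtype=float)
b = np.array([2.0, 8.0, 10.0])

c = forward_sub(L, b)  # L c = b  gives c = [2, 4, 8]
x = back_sub(U, c)     # U x = c  gives x = [-1, 2, 2]
```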
Remark:
* The $LU$ form is unsymmetric in one respect: $U$ has the pivots along its diagonal, while $L$ always has $1$'s.
* We can rewrite $U$ as
$$
U = \begin{bmatrix}
d_1 & & & 0 \\
& d_2 & & \\
& & \ddots & \\
0 & & & d_n
\end{bmatrix}
\begin{bmatrix}
1 & \frac{u_{12}}{d_1} & \frac{u_{13}}{d_1} & \cdots \\
& 1 & \frac{u_{23}}{d_2} & \cdots \\
& & 1 & \\
0 & & & 1
\end{bmatrix}
$$
$$
\text{e.g. }
A = \begin{bmatrix}
3 & 4 \\
1 & 2
\end{bmatrix} =
\begin{bmatrix}
1 & 0 \\
1/3 & 1
\end{bmatrix}
\begin{bmatrix}
3 & 4 \\
0 & 2/3
\end{bmatrix} =
\begin{bmatrix}
1 & 0 \\
1/3 & 1
\end{bmatrix}
\begin{bmatrix}
3 & 0 \\
0 & 2/3
\end{bmatrix}
\begin{bmatrix}
1 & 4/3 \\
0 & 1
\end{bmatrix} =
LDU
$$
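Splitting the pivots out of $U$ is a one-liner numerically. A sketch for the $2\times 2$ example above:

```python
import numpy as np

# Factor the pivots out of U: U = D (unit upper triangular).
U = np.array([[3.0, 4.0],
              [0.0, 2.0 / 3.0]])
D = np.diag(np.diag(U))        # diagonal matrix of pivots
U_unit = np.linalg.inv(D) @ U  # each row divided by its pivot
print(U_unit)                  # [[1, 4/3], [0, 1]]
print(np.allclose(D @ U_unit, U))  # True
```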
#### Theorem
If $A = L_1D_1U_1$ and $A=L_2D_2U_2$
then $L_1=L_2, D_1=D_2, U_1=U_2$
That is, if $A$ has an $LDU$ decomposition, then it <span style="color: red">is</span> unique.
### Row Exchange and Permutation Matrices
$$P_{23} = \begin{bmatrix}
1 & 0 & 0 \\
0 & 0 & 1 \\
0 & 1 & 0
\end{bmatrix}$$
$1^\circ$
$$\begin{bmatrix}
1 & 0 & 0 \\
0 & 0 & 1 \\
0 & 1 & 0
\end{bmatrix}\begin{bmatrix}
1 \\
3 \\
5
\end{bmatrix} = \begin{bmatrix}
1 \\
5 \\
3
\end{bmatrix}$$
$2^\circ$
$$PA=\begin{bmatrix}
1 & 0 & 0 \\
0 & 0 & 1 \\
0 & 1 & 0
\end{bmatrix}\begin{bmatrix}
2 & 4 & 1 \\
0 & 0 & 3 \\
0 & 6 & 5
\end{bmatrix} = \begin{bmatrix}
2 & 4 & 1 \\
0 & 6 & 5 \\
0 & 0 & 3
\end{bmatrix}
$$
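Both examples, plus the row-vs-column distinction in the Note below, can be checked with NumPy:

```python
import numpy as np

P = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 1, 0]])  # P_23: exchanges rows 2 and 3

x = np.array([1, 3, 5])
A = np.array([[2, 4, 1],
              [0, 0, 3],
              [0, 6, 5]])

print(P @ x)  # rows 2, 3 exchanged: [1, 5, 3]
print(P @ A)  # rows 2, 3 of A exchanged
print(A @ P)  # columns 2, 3 of A exchanged instead
```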

#### Note
1. $PA$: performs a row exchange on $A$
2. $BP$: performs a column exchange on $B$
3. PAx = Pb
- Should we permute the components of $x = \begin{bmatrix}u\\v\\w\end{bmatrix}$ as well? <span style="color: red">No!</span>
eg :
$$
A = \begin{bmatrix}
0 & a & b \\
0 & 0 & c \\
d & e & f
\end{bmatrix}_{3\times 3}
$$
- if $d=0$, the problem is incurable. The matrix is singular.
- if $d\neq 0$,
$P_{13}A = \begin{bmatrix}
d & e & f \\
0 & 0 & c \\
0 & a & b
\end{bmatrix}$
if $a\neq 0$,
$P_{23}P_{13}A=\begin{bmatrix}
d & e & f \\
0 & a & b \\
0 & 0 & c
\end{bmatrix}$
$P_{23}P_{13}\neq P_{13}P_{23}$
#### 1J
* In the nonsingular case, there's a permutation matrix $P$ that reorders the rows of $A$ to avoid zeros in the pivot positions. In this case,
* $Ax = b$ has a unique solution.
* It is found by elimination w/ row exchanges.
* With the rows reordered in advance, $PA$ can be factored into $LU$ ($PA=LU$).
* In the singular case, no reordering can produce a full set of pivots.
Ex
$$
A = \begin{bmatrix}
1 & 1 & 1\\
1 & 1 & 3\\
2 & 5 & 8
\end{bmatrix}
\xrightarrow[l_{31}=2]{l_{21}=1}
\begin{bmatrix}
1 & 1 & 1\\
0 & 0 & 2\\
0 & 3 & 6
\end{bmatrix}
\xrightarrow[]{P_{23}}
\begin{bmatrix}
1 & 1 & 1\\
0 & 3 & 6\\
0 & 0 & 2
\end{bmatrix}
$$
$$
L \neq \begin{bmatrix}
1 & 0 & 0\\
1 & 1 & 0\\
2 & 0 & 1
\end{bmatrix};
L = \begin{bmatrix}
1 & 0 & 0\\
2 & 1 & 0\\
1 & 0 & 1
\end{bmatrix}
$$
<span style="color: green">PA=LU</span>
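The factorization claimed above can be confirmed by multiplying everything out (the $P$, $L$, $U$ below are the ones computed in the example):

```python
import numpy as np

A = np.array([[1, 1, 1], [1, 1, 3], [2, 5, 8]], dtype=float)
P = np.array([[1, 0, 0], [0, 0, 1], [0, 1, 0]], dtype=float)  # P_23
L = np.array([[1, 0, 0], [2, 1, 0], [1, 0, 1]], dtype=float)  # note the swapped multipliers
U = np.array([[1, 1, 1], [0, 3, 6], [0, 0, 2]], dtype=float)

print(np.allclose(P @ A, L @ U))  # True: PA = LU
```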
To summarize:
A good code for Gaussian Elimination keeps a record of $L$, $U$ & $P$. Together they give the solution of $Ax = b$ via two triangular systems. If the system $Ax=b$ has a unique solution, then we say:
1. the system is nonsingular or
2. the matrix is nonsingular
Otherwise, it is singular.
## 1.6 Inverse and Transpose
### Definition
An $n\times n$ matrix $A$ is invertible
if there is an $n\times n$ matrix $B$ such that $BA=I=AB$
#### 1K
If $A$ is invertible, then the matrix $B$ satisfying $AB = BA = I$ is unique.
proof: Suppose $C$ also satisfies $AC = CA = I$.
Then $B = BI = B(AC) = (BA)C = IC = C$, i.e. $B = C$.
We call this $B$ the inverse of $A$, denoted $A^{-1}$.
#### Note
- Not all $n\times n$ matrices have inverses.
- $1^\circ \begin{bmatrix}
0 & 0 \\
0 & 0
\end{bmatrix} _ {2\times 2}$
- $2^\circ$ If $Ax = 0$ has a nonzero solution ($x\neq 0$), then $A$ has no inverse!
Otherwise $x = A^{-1}(Ax) = A^{-1}0 = 0$, a contradiction.
- The inverse of $A^{-1}$ is $A$ itself. i.e $(A^{-1})^{-1} = A$
- If $A = \begin{bmatrix}a\end{bmatrix}_{1 \times 1}$ and $a \neq 0$, then $A^{-1} = \begin{bmatrix}\frac{1}{a}\end{bmatrix}$
- If $A = \begin{bmatrix} d_1 & & 0 \\ & \ddots & \\ 0 & & d_n \end{bmatrix}$ and $\forall i,\ d_i \neq 0$, then $A^{-1} = \begin{bmatrix}\frac{1}{d_1} & & & 0 \\ & \frac{1}{d_2} & & \\ & & \ddots & \\ 0 & & & \frac{1}{d_n}\end{bmatrix}$
#### proposition(1L)
If $A$ and $B$ are invertible, $(AB)^{-1} = B^{-1}A^{-1}$. $(A_1A_2A_3\dots A_n)^{-1} = A_n^{-1} A_{n-1}^{-1}A_{n-2}^{-1}\dots A_1^{-1}$
Note that $(AB)^{-1} \color{red}{\huge \neq} A^{-1}B^{-1}$
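A quick numeric check of the proposition (the random test matrices are our choice; adding $3I$ just keeps them safely nonsingular):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 3 * np.eye(3)  # well-conditioned test matrices
B = rng.standard_normal((3, 3)) + 3 * np.eye(3)

# (AB)^{-1} = B^{-1} A^{-1}; the order reverses, as in the proposition.
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
print(np.allclose(lhs, rhs))  # True
```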
### The calculation of $A^{-1}$

$A_{n\times n} A^{-1}_{n\times n} = I_{n\times n}$
$A_{n\times n} B_{n\times n} = I_n$
$\implies A_{n\times n}[B_1|B_2|\cdots|B_n] = [e_1|e_2|\cdots|e_n]_{n\times n}$
$\implies [AB_1|AB_2|\cdots|AB_n] = [e_1|e_2|\cdots|e_n]_{n\times n}$
$\implies AB_1 = e_1,\ AB_2 = e_2,\ \dots,\ AB_n = e_n$
#### Gauss-Jordan Method
Instead of stopping at $U$ and switching to back-substitution, it continues by subtracting multiples of each row from the rows above, until it reaches a diagonal matrix; then each row is divided by its pivot.
$$
[A|I] \xrightarrow{\color{blue}{\times L^{-1}}} [U|\color{blue}{L^{-1}}] \xrightarrow{\color{blue}{\times U^{-1}}} [I|\color{blue}{A^{-1}}]
$$
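The Gauss-Jordan sweep can be sketched in a few lines. This version (the helper name `gauss_jordan_inverse` is ours) scales each pivot row as it goes rather than all at the end, and omits row exchanges for simplicity:

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing [A | I] to [I | A^{-1}].
    Assumes A is square and nonsingular with nonzero pivots (no row exchanges)."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for j in range(n):
        M[j] /= M[j, j]                  # scale the pivot row
        for i in range(n):
            if i != j:
                M[i] -= M[i, j] * M[j]   # eliminate above AND below the pivot
    return M[:, n:]

A = np.array([[2.0, 4.0, -2.0],
              [4.0, 9.0, -3.0],
              [-2.0, -3.0, 7.0]])
Ainv = gauss_jordan_inverse(A)
print(np.allclose(A @ Ainv, np.eye(3)))  # True
```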

## Invertible=Nonsingular
Question: What kind of matrices are invertible?
Answer:
1. independent columns (rows) (Ch. 2)
2. nonzero determinant (Ch. 4)
3. nonzero eigenvalues (Ch. 5)
4. nonzero pivots