# LU Decomposition
## The Big Idea
LU decomposition is an efficient way to solve a system of linear equations, $A \mathbf{x} = \mathbf{b}$, especially when you need to solve many systems with the same matrix $A$ but different vectors $\mathbf{b}$. Instead of running Gaussian elimination on each augmented matrix, we first decompose $A$ into a **lower triangular matrix** $L$ and an **upper triangular matrix** $U$.
With the decomposition $A=LU$, the system $A \mathbf{x} = \mathbf{b}$ becomes $LU \mathbf{x} = \mathbf{b}$. We can then solve this in two simple triangular steps:
1. Solve $L\mathbf{y} = \mathbf{b}$ for an intermediate vector $\mathbf{y}$ using **forward substitution**.
2. Solve $U\mathbf{x}=\mathbf{y}$ for the final solution $\mathbf{x}$ using **backward substitution**.
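In practice this factor-once, solve-many workflow is exactly what library routines provide. A minimal sketch (the matrix and right-hand sides are sample values; note that SciPy's `lu_factor` actually computes a *pivoted* factorization $PA=LU$, slightly more general than the form described here):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Sample system; values chosen for illustration.
A = np.array([[1., 2., 2.],
              [4., 4., 2.],
              [4., 6., 4.]])
b1 = np.array([3., 6., 10.])
b2 = np.array([1., 0., 0.])

# Factor A once (lu_factor uses partial pivoting: P A = L U)...
lu, piv = lu_factor(A)

# ...then reuse the same factorization for every right-hand side.
x1 = lu_solve((lu, piv), b1)
x2 = lu_solve((lu, piv), b2)
```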

## Definition and Properties
:::info
**Definition**
1. A **lower triangular matrix** is a matrix with zeros above the diagonal. For example:
$$
\begin{bmatrix} * & & & \\ * & * & & \\ * & * & * & \\ * & * & * & * \end{bmatrix}
\hspace{10mm}
\begin{bmatrix} * & & \\ * & * & \\ * & * & * \\ * & * & * \\ * & * & * \end{bmatrix}
\hspace{10mm}
\begin{bmatrix} * & & & & \\ * & * & & & \\ * & * & * & & \end{bmatrix}
$$
2. A **unit lower triangular matrix** is a square matrix with ones on the diagonal and zeros above the diagonal. For example:
$$
\begin{bmatrix} 1 & & & \\ * & 1 & & \\ * & * & 1 & \\ * & * & * & 1 \end{bmatrix}
$$
3. An **upper triangular matrix** is a matrix with zeros below the diagonal. For example:
$$
\begin{bmatrix} * & * & * & * \\ & * & * & * \\ & & * & * \\ & & & * \end{bmatrix}
\hspace{10mm}
\begin{bmatrix} * & * & * \\ & * & * \\ & & * \\ & & \\ & & \end{bmatrix}
\hspace{10mm}
\begin{bmatrix} * & * & * & * & * \\ & * & * & * & * \\ & & * & * & * \end{bmatrix}
$$
4. A **unit upper triangular matrix** is a *square* matrix with ones on the diagonal and zeros below the diagonal. For example:
$$
\begin{bmatrix} 1 & * & * & * \\ & 1 & * & * \\ & & 1 & * \\ & & & 1 \end{bmatrix}
$$
5. An **atomic upper(lower) triangular matrix** is a unit upper(lower) triangular matrix with nonzero entries above(below) the diagonal in only one column. For example:
$$
\begin{bmatrix} 1 & & & \\ * & 1 & & \\ * & 0 & 1 & \\ * & 0 & 0 & 1 \end{bmatrix}
\hspace{10mm}
\begin{bmatrix} 1 & 0 & * & 0 \\ & 1 & * & 0 \\ & & 1 & 0 \\ & & & 1 \end{bmatrix}
$$
:::
:::danger
**Theorem**
Let $L_h$ be an atomic lower triangular $m \times m$ matrix whose $h$-th column has entries $l_{h+1,h},\ldots,l_{m,h} \neq 0$ below the diagonal. Then
1. Left-multiplication by $L_h$ to a matrix corresponds to the operation that adds $l_{i,h}$ times row $h$ to row $i$ for $i=h+1,h+2,\ldots,m$.
2. The inverse of $L_h$ is also an atomic lower triangular matrix. Moreover, the nonzero elements of $L_h$ below the diagonal change sign in $L_h^{-1}$, i.e.,
$$L_h^{-1}=
\begin{bmatrix}
1 & 0 &\ldots & 0 & \ldots& 0 &0\\
0 & 1 & \ldots &0 & \ldots& 0 &0\\
\vdots& &\ddots & \vdots &&&\vdots \\
0 & 0 &\ldots & 1 &\ldots& 0 &0\\
0 & 0 &\ldots& -l_{h+1,h} &\ddots& 0 &0\\
\vdots & & & \vdots& &\ddots&\vdots\\
0 & 0 &\ldots & -l_{m,h} &0&\ldots& 1
\end{bmatrix}
$$
3. Every unit lower triangular matrix can be decomposed into a product of atomic lower triangular matrices, i.e.,
$$L_1 L_2 \ldots L_{m-1}=
\begin{bmatrix}
1 & 0 &\ldots & 0 & \ldots& 0 &0 \\
l_{2,1} & 1 & \ldots &0 & \ldots& 0&0 \\
\vdots& &\ddots & &&&\vdots\\
l_{h,1} & l_{h,2} &\ldots & 1 &\ldots& 0&0 \\
l_{h+1,1} & l_{h+1,2} &\ldots& l_{h+1,h} &\ddots& 0&0 \\
\vdots & & & \vdots& &\ddots&\vdots\\
l_{m,1} & l_{m,2} &\ldots & l_{m,h} &l_{m,h+1}&\ldots& 1
\end{bmatrix}
$$
:::
:::warning
**Example** Consider
$$
L_1 =
\begin{bmatrix}
1 & 0 & 0 \\
a & 1 & 0 \\
b & 0 & 1
\end{bmatrix},
\quad L_2 =\begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & c & 1
\end{bmatrix} ,
\quad
A =
\begin{bmatrix}
1 & -1 & 1 \\
-1 & 1 & 1 \\
-1 & 2 & 3 \\
\end{bmatrix}
$$
Then the inverse matrices are
$$ L_1^{-1} =
\begin{bmatrix}
1 & 0 & 0 \\
-a & 1 & 0 \\
-b & 0 & 1
\end{bmatrix},\quad
L_2^{-1} = \begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & -c & 1
\end{bmatrix}
$$
and the product is
$$
L_1 L_2 =
\begin{bmatrix}
1 & 0 & 0 \\
a & 1 & 0 \\
b & c & 1
\end{bmatrix}
$$
Therefore
$$
L_1 L_2 A=
\begin{bmatrix} 1 & -1 & 1 \\ a-1 & 1-a & a+1 \\ b-c-1 & -b+c+2 & b+c+3 \end{bmatrix}
$$
:::
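The three properties of the theorem can also be checked numerically. Below is a small sketch using the matrices of this example, with arbitrary sample values substituted for the symbols $a$, $b$, $c$:

```python
import numpy as np

a, b, c = 2.0, 3.0, 5.0   # arbitrary sample values for the symbols

L1 = np.array([[1, 0, 0],
               [a, 1, 0],
               [b, 0, 1]])
L2 = np.array([[1, 0, 0],
               [0, 1, 0],
               [0, c, 1]])
A = np.array([[ 1., -1., 1.],
              [-1.,  1., 1.],
              [-1.,  2., 3.]])

# Property 1: L1 @ A adds a*(row 1) to row 2 and b*(row 1) to row 3.
expected = A.copy()
expected[1] += a * A[0]
expected[2] += b * A[0]
assert np.allclose(L1 @ A, expected)

# Property 2: the inverse just flips the signs of the subdiagonal entries.
assert np.allclose(np.linalg.inv(L1),
                   np.array([[ 1, 0, 0], [-a, 1, 0], [-b, 0, 1]]))

# Property 3: the product collects the subdiagonal columns in place.
assert np.allclose(L1 @ L2,
                   np.array([[1, 0, 0], [a, 1, 0], [b, c, 1]]))
```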
## The LU Decomposition Theorem
:::danger
**Theorem**
If a matrix $A$ can be reduced to row echelon form using only the row operation of "adding a multiple of one row to a row below it," then $A$ has an $LU$ decomposition of the form $A=LU$.
* $U$ is the resulting upper triangular matrix from the Gaussian elimination.
* $L$ is a unit lower triangular matrix. Its entries below the diagonal are the negatives of the multiples used in the corresponding row operations. For example, if you add $c$ times row $i$ to row $j$, the entry $L_{ji}$ will be $-c$.
:::
:::warning
**Example** Consider
$$A =
\begin{bmatrix}
2 & -6 & -2 & 4 \\
-1 & 3 & 3 & 2 \\
-1 & 3 & 7 & 10
\end{bmatrix}
$$
* To eliminate the first entry in the second row, we add $1/2$ times the first row to the second row. We do the same for the third row. This is equivalent to multiplying by the elementary matrix $L_1$:
$$
L_1 A
= \begin{bmatrix}
1 & 0 & 0 \\
\frac{1}{2} & 1 & 0 \\
\frac{1}{2} & 0 & 1
\end{bmatrix}
\begin{bmatrix}
2 & -6 & -2 & 4 \\
-1 & 3 & 3 & 2 \\
-1 & 3 & 7 & 10
\end{bmatrix}
= \begin{bmatrix}
2 & -6 & -2 & 4 \\
0 & 0 & 2 & 4 \\
0 & 0 & 6 & 12
\end{bmatrix}
$$
* Next, we eliminate the entry below the second pivot (the $6$ in the third column) by adding $-3$ times the second row to the third row. This is equivalent to multiplying by $L_2$:
$$
L_2 (L_1 A) =
\begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & -3 & 1
\end{bmatrix}
\begin{bmatrix}
2 & -6 & -2 & 4 \\
0 & 0 & 2 & 4 \\
0 & 0 & 6 & 12
\end{bmatrix}
= \begin{bmatrix}
2 & -6 & -2 & 4 \\
0 & 0 & 2 & 4 \\
0 & 0 & 0 & 0
\end{bmatrix}
$$
The resulting upper triangular matrix is $U=L_2 L_1 A$. Since $L_2 L_1 A = U$, we know that $A = L_1^{-1} L_2^{-1} U$.
The matrix $L$ is the product of the inverses of the elementary matrices:
$$
L = L_1^{-1} L_2^{-1} =
\begin{bmatrix}
1 & 0 & 0 \\
-\frac{1}{2} & 1 & 0 \\
-\frac{1}{2} & 0 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 3 & 1
\end{bmatrix}
= \begin{bmatrix}
1 & 0 & 0 \\
-\tfrac{1}{2} & 1 & 0 \\
-\tfrac{1}{2} & 3 & 1
\end{bmatrix}
$$
:::
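The elimination in this example can be reproduced in code. Below is a minimal sketch of elimination without row swaps; `lu_no_pivot` is an illustrative helper name, and the function assumes no zero pivot ever requires a row interchange (each pivot candidate is nonzero or its whole column below is zero):

```python
import numpy as np

def lu_no_pivot(A):
    # Eliminate using only "add a multiple of one row to a lower row",
    # recording each multiplier in L so that L @ U == A throughout.
    m, n = A.shape
    U = A.astype(float).copy()
    L = np.eye(m)
    row = 0
    for col in range(n):
        if row >= m or np.isclose(U[row, col], 0.0):
            continue                  # no usable pivot in this column
        for i in range(row + 1, m):
            mult = U[i, col] / U[row, col]
            U[i] -= mult * U[row]     # subtract mult * (pivot row) ...
            L[i, row] = mult          # ... and record the multiplier in L
        row += 1
    return L, U

A = np.array([[ 2., -6., -2.,  4.],
              [-1.,  3.,  3.,  2.],
              [-1.,  3.,  7., 10.]])
L, U = lu_no_pivot(A)
```

Running this on the example matrix reproduces the $L$ and $U$ computed above.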
:::danger
**Theorem** Suppose $A$ has an LU decomposition $A = LU$.
1. $\mathrm{rank}(A) = \mathrm{rank}(U)$
2. $\det(A) = \det(U) = u_{1,1} \cdots u_{m,m}$ where $u_{1,1} , \dots , u_{m,m}$ are the diagonal entries of $U$.
:::
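A quick numerical sanity check of both claims, using the triangular pair that appears in the substitution example later in these notes:

```python
import numpy as np

L = np.array([[1., 0. , 0.],
              [4., 1. , 0.],
              [4., 0.5, 1.]])
U = np.array([[1.,  2.,  2.],
              [0., -4., -6.],
              [0.,  0., -1.]])
A = L @ U

# det(A) = det(U) = product of U's diagonal (here 1 * (-4) * (-1) = 4),
# since det(L) = 1 for a unit lower triangular L.
assert np.isclose(np.linalg.det(A), np.prod(np.diag(U)))

# rank(A) = rank(U)
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(U)
```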
:::success
**Exercise**
1. Consider the matrix
$$
A = \left[ \begin{array}{rrrr}
-3 & \phantom{+}1 & 2 & \phantom{+}0 \\
3 & 1 & -2 & 1 \\
-6 & 2 & 5 & 1 \\
-9 & 3 & 4 & 2
\end{array} \right]
$$
Find the $LU$ decomposition of $A$ and compute $\mathrm{det}(A)$.
2. If $A$ is of the form
$$
A =
\begin{bmatrix} * & * & 0 & 0 \\ * & * & * & 0 \\
0 & * & * & * \\
0 & 0 & * & *
\end{bmatrix}
$$
and has an $LU$ decomposition, then $L$ and $U$ are of the form
$$
L =
\begin{bmatrix}
1 & 0 & 0 & 0 \\ * & 1 & 0 & 0 \\
0 & * & 1 & 0 \\
0 & 0 & * & 1
\end{bmatrix}
\quad
U =
\begin{bmatrix}* & * & 0 & 0 \\
0 & * & * & 0 \\
0 & 0 & * & * \\
0 & 0 & 0 & *
\end{bmatrix}
$$
:::
## Forward and Backward Substitution
This method is most useful for solving systems of the form $A\mathbf{x}=\mathbf{b}$.
1. **Forward Substitution ($L\mathbf{y}=\mathbf{b}$):** Solve for $\mathbf{y}$ from the top down. This is a simple process because $L$ is a triangular matrix.
2. **Backward Substitution ($U\mathbf{x}=\mathbf{y}$):** Once you have $\mathbf{y}$, solve for $\mathbf{x}$ from the bottom up. Again, this is a straightforward process because $U$ is also a triangular matrix.
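Assuming $L$ and $U$ have nonzero diagonal entries, the two substitutions can be sketched directly (`forward_sub` and `backward_sub` are illustrative helper names):

```python
import numpy as np

def forward_sub(L, b):
    # Solve L y = b from the top down: row i only involves y[0..i].
    y = np.zeros(len(b))
    for i in range(len(b)):
        y[i] = (b[i] - L[i, :i] @ y[:i]) / L[i, i]
    return y

def backward_sub(U, y):
    # Solve U x = y from the bottom up: row i only involves x[i..n-1].
    x = np.zeros(len(y))
    for i in range(len(y) - 1, -1, -1):
        x[i] = (y[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

# The triangular pair and right-hand side from the example below:
L = np.array([[1., 0. , 0.],
              [4., 1. , 0.],
              [4., 0.5, 1.]])
U = np.array([[1.,  2.,  2.],
              [0., -4., -6.],
              [0.,  0., -1.]])
b = np.array([3., 6., 10.])

y = forward_sub(L, b)   # intermediate vector
x = backward_sub(U, y)  # final solution
```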
:::warning
**Example** Consider the system
$$
\begin{cases}
x_1 + 2x_2 + 2x_3 = 3 \\
4x_1 + 4x_2 + 2x_3 = 6 \\
4x_1 + 6x_2 + 4x_3 = 10
\end{cases}
$$
We can use the LU decomposition of the coefficient matrix $A$ (the elimination steps are not shown here to save space):
$$
L =
\begin{bmatrix}
1 & 0 & 0 \\
4 & 1 & 0 \\
4 & \tfrac{1}{2} & 1
\end{bmatrix}, \quad
U =
\begin{bmatrix}
1 & 2 & 2 \\
0 & -4 & -6 \\
0 & 0 & -1
\end{bmatrix}
$$
First, perform **forward substitution** on $L\mathbf{y}=\mathbf{b}$ to find $\mathbf{y}$:
\begin{align}
\underbrace{\begin{bmatrix}
1 & 0 & 0 \\
4 & 1 & 0 \\
4 & \tfrac{1}{2} & 1
\end{bmatrix}}_{L}
\underbrace{\begin{bmatrix}
y_1 \\ y_2 \\ y_3
\end{bmatrix}}_{\mathbf{y}}
=\underbrace{
\begin{bmatrix}
3 \\ 6 \\ 10
\end{bmatrix}}_{\mathbf{b}}
\end{align}
So, $\mathbf{y} = \begin{bmatrix}3 \\ -6 \\ 1\end{bmatrix}$.
Next, perform backward substitution on $U\mathbf{x}=\mathbf{y}$ to find $\mathbf{x}$:
$$
\underbrace{
\begin{bmatrix}
1 & 2 & 2 \\
0 & -4 & -6 \\
0 & 0 & -1
\end{bmatrix}}_{U}
\underbrace{
\begin{bmatrix}
x_1 \\ x_2 \\ x_3
\end{bmatrix}}_{\mathbf{x}}
= \underbrace{
\begin{bmatrix}
3 \\ -6 \\ 1
\end{bmatrix}}_{\mathbf{y}}
$$
The solution is $\mathbf{x} =
\begin{bmatrix}-1 \\ 3 \\ -1\end{bmatrix}$.
:::
### Finding the Inverse of a Matrix using LU Decomposition
Suppose an $n \times n$ matrix $A$ has an $LU$ decomposition $A=LU$. If the inverse of $A$ exists, denoted $A^{-1}$, we can find it by solving $n$ linear systems:
\begin{align}
AA^{-1} = I_n
&\Leftrightarrow A \left[\mathbf{x}_1,\ldots,\mathbf{x}_n\right]=I_n \\
&\Leftrightarrow A \mathbf{x}_i =\mathbf{e}_i \quad i=1, \dots, n \\
&\Leftrightarrow LU \mathbf{x}_i =\mathbf{e}_i \quad i=1, \dots, n
\end{align}
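These $n$ solves can be written compactly with a library triangular solver. A minimal sketch using the $L$, $U$ pair from the substitution example above:

```python
import numpy as np
from scipy.linalg import solve_triangular

L = np.array([[1., 0. , 0.],
              [4., 1. , 0.],
              [4., 0.5, 1.]])
U = np.array([[1.,  2.,  2.],
              [0., -4., -6.],
              [0.,  0., -1.]])

n = L.shape[0]
I = np.eye(n)
A_inv = np.zeros((n, n))
for i in range(n):
    y = solve_triangular(L, I[:, i], lower=True)  # forward substitution
    A_inv[:, i] = solve_triangular(U, y)          # backward substitution

# Each column solves A x_i = e_i, so A @ A_inv should be the identity.
assert np.allclose((L @ U) @ A_inv, I)
```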
:::warning
**Example** Compute the inverse of the matrix $A$ from the previous example.
To compute the inverse, we solve the system $A \mathbf{x}_i = \mathbf{e}_i$ for each column $\mathbf{x}_i$ of the inverse, where $\mathbf{e}_i$ is the $i$-th column of the identity matrix, using the decomposition $A=LU$:
$$
L =
\begin{bmatrix}
1 & 0 & 0 \\
4 & 1 & 0 \\
4 & \tfrac{1}{2} & 1
\end{bmatrix}, \quad
U =
\begin{bmatrix}
1 & 2 & 2 \\
0 & -4 & -6 \\
0 & 0 & -1
\end{bmatrix}
$$
1. Solve $A\mathbf{x}_1 = \mathbf{e}_1$
\begin{align}
L \mathbf{y}_1 = \mathbf{e}_1 =
\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}
\;\; &\Longrightarrow\;\;
\mathbf{y}_1 =
\begin{bmatrix} 1 \\ -4 \\ -2 \end{bmatrix} \\
U \mathbf{x}_1 = \mathbf{y}_1 \;\; &\Longrightarrow\;\;
\mathbf{x}_1 =
\begin{bmatrix} 1 \\ -2 \\ 2 \end{bmatrix}
\end{align}
2. Solve $A\mathbf{x}_2 = \mathbf{e}_2$
\begin{align}
L \mathbf{y}_2 = \mathbf{e}_2 =
\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}
\;\; &\Longrightarrow\;\;
\mathbf{y}_2 =
\begin{bmatrix} 0 \\ 1 \\ -\tfrac{1}{2} \end{bmatrix} \\
U \mathbf{x}_2 = \mathbf{y}_2 \;\; &\Longrightarrow\;\;
\mathbf{x}_2 =
\begin{bmatrix} 1 \\ -1 \\ \tfrac{1}{2} \end{bmatrix}
\end{align}
3. Solve $A\mathbf{x}_3 = \mathbf{e}_3$
\begin{align}
L \mathbf{y}_3 = \mathbf{e}_3 =
\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}
\;\; &\Longrightarrow\;\;
\mathbf{y}_3 =
\begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \\
U \mathbf{x}_3 = \mathbf{y}_3 \;\; &\Longrightarrow\;\;
\mathbf{x}_3 =
\begin{bmatrix} -1 \\ \tfrac{3}{2} \\ -1 \end{bmatrix}
\end{align}
Therefore, the inverse matrix is:
$$
A^{-1} =
\begin{bmatrix}
1 & 1 & -1 \\
-2 & -1 & \tfrac{3}{2} \\
2 & \tfrac{1}{2} & -1
\end{bmatrix}
$$
:::
:::success
**Exercise**
3. Suppose Gaussian elimination without pivoting applied to a matrix $A$ produces the result
$$
L_3 L_2 L_1 A = U
$$
where
$$
L_1 =
\begin{bmatrix}
1 & 0 & 0 & 0 \\
1 & 1 & 0 & 0 \\
3 & 0 & 1 & 0 \\
2 & 0 & 0 & 1
\end{bmatrix},
\quad
L_2 =
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 2 & 1 & 0 \\
0 & 5 & 0 & 1
\end{bmatrix},
\quad
L_3 =
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 4 & 1
\end{bmatrix}
$$
Determine the matrix $L$ in the decomposition $A = LU$.
:::