# 譜分解
Spectral decomposition

This work by Jephian Lin is licensed under a [Creative Commons Attribution 4.0 International License](http://creativecommons.org/licenses/by/4.0/).
$\newcommand{\trans}{^\top}
\newcommand{\adj}{^{\rm adj}}
\newcommand{\cof}{^{\rm cof}}
\newcommand{\inp}[2]{\left\langle#1,#2\right\rangle}
\newcommand{\dunion}{\mathbin{\dot\cup}}
\newcommand{\bzero}{\mathbf{0}}
\newcommand{\bone}{\mathbf{1}}
\newcommand{\ba}{\mathbf{a}}
\newcommand{\bb}{\mathbf{b}}
\newcommand{\bc}{\mathbf{c}}
\newcommand{\bd}{\mathbf{d}}
\newcommand{\be}{\mathbf{e}}
\newcommand{\bh}{\mathbf{h}}
\newcommand{\bp}{\mathbf{p}}
\newcommand{\bq}{\mathbf{q}}
\newcommand{\br}{\mathbf{r}}
\newcommand{\bx}{\mathbf{x}}
\newcommand{\by}{\mathbf{y}}
\newcommand{\bz}{\mathbf{z}}
\newcommand{\bu}{\mathbf{u}}
\newcommand{\bv}{\mathbf{v}}
\newcommand{\bw}{\mathbf{w}}
\newcommand{\tr}{\operatorname{tr}}
\newcommand{\nul}{\operatorname{null}}
\newcommand{\rank}{\operatorname{rank}}
%\newcommand{\ker}{\operatorname{ker}}
\newcommand{\range}{\operatorname{range}}
\newcommand{\Col}{\operatorname{Col}}
\newcommand{\Row}{\operatorname{Row}}
\newcommand{\spec}{\operatorname{spec}}
\newcommand{\vspan}{\operatorname{span}}
\newcommand{\Vol}{\operatorname{Vol}}
\newcommand{\sgn}{\operatorname{sgn}}
\newcommand{\idmap}{\operatorname{id}}
\newcommand{\am}{\operatorname{am}}
\newcommand{\gm}{\operatorname{gm}}
\newcommand{\mult}{\operatorname{mult}}
\newcommand{\iner}{\operatorname{iner}}$
```python
from lingeo import random_int_list
from linspace import QR
```
## Main idea
Continuing the introduction of the spectral decomposition in 313, this section provides its theoretical foundation.
Let $A$ be an $n\times n$ real symmetric matrix.
Recall that the spectral theorem ensures the following equivalent properties.
- There is an orthogonal matrix $Q$ such that $Q\trans AQ$ is diagonal.
- There is an orthonormal basis $\{\bv_1,\ldots, \bv_n\}$ of $\mathbb{R}^n$ such that $A\bv_i = \lambda_i\bv_i$ for some $\lambda_i$ for each $i = 1,\ldots, n$.
Here $Q$ is the matrix whose columns are $\{\bv_1, \ldots, \bv_n\}$.
Equivalently, we may write
$$
A = QDQ^\top =
\begin{bmatrix}
| & ~ & | \\
{\bf v}_1 & \cdots & {\bf v}_n \\
| & ~ & |
\end{bmatrix}
\begin{bmatrix}
\lambda_1 & ~ & ~ \\
~ & \ddots & ~ \\
~ & ~ & \lambda_n \\
\end{bmatrix}
\begin{bmatrix}
- & {\bf v}_1^\top & - \\
~ & \vdots & ~\\
- & {\bf v}_n^\top & -
\end{bmatrix} =
\sum_{i = 1}^n \lambda_i {\bf v}_i{\bf v}_i^\top.
$$
Suppose $\{\lambda_1,\ldots,\lambda_n\}$ only has $q$ distinct values $\{\mu_1,\ldots, \mu_q\}$.
For each $j = 1,\ldots, q$, we may let $\displaystyle P_j = \sum_{\lambda_i = \mu_j} {\bf v}_i{\bf v}_i^\top$.
Thus, we have the following.
##### Spectral theorem (projection version)
Let $A$ be an $n\times n$ real symmetric matrix.
Then there are $q$ distinct values $\mu_1,\ldots, \mu_q$ and $q$ projection matrices $P_1,\ldots, P_q$ such that
- $A = \sum_{j=1}^q \mu_j P_j$,
- $P_i^2 = P_i$ for any $i$,
- $P_iP_j = O$ for any $i \neq j$, and
- $\sum_{j=1}^q P_j = I_n$.
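
As a quick illustration, here is a minimal Sage sketch (not part of the original notes; it assumes a Sage session) that builds the projections $P_j$ from orthonormal bases of the eigenspaces of a concrete symmetric matrix and checks two of the properties above.

```python
# Minimal sketch: build the projections P_j of a symmetric matrix from
# orthonormal eigenspace bases and verify the properties above.
A = matrix(QQ, [[2, -1, -1], [-1, 2, -1], [-1, -1, 2]])
projections = {}
for mu, vecs, mult in A.eigenvectors_right():
    # orthonormalize the eigenvectors within each eigenspace;
    # AA (the real algebraic numbers) allows exact square roots
    G, _ = matrix(AA, vecs).gram_schmidt(orthonormal=True)
    projections[mu] = sum(v.column() * v.row() for v in G.rows())
print(sum(mu * P for mu, P in projections.items()) == A)    # A = sum mu_j P_j
print(sum(projections.values()) == identity_matrix(3))      # sum P_j = I_n
```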
## Side stories
- $P_i$ as a polynomial of $A$
- orthogonal projection matrix
- eigenvector-eigenvalue identity
## Experiments
##### Exercise 1
執行以下程式碼。
<!-- eng start -->
Run the code below.
<!-- eng end -->
```python
### code
set_random_seed(0)
print_ans = False

while True:
    L = matrix(3, random_int_list(9, 2))
    eigs = random_int_list(2)
    if L.is_invertible() and eigs[0] != eigs[1]:
        break

Q, R = QR(L)
for j in range(3):
    v = Q[:,j]
    length = sqrt((v.transpose() * v)[0,0])
    Q[:,j] = v / length

eigs.append(eigs[-1])
D = diagonal_matrix(eigs)
A = Q * D * Q.transpose()

pretty_print(LatexExpr("A ="), A)
pretty_print(LatexExpr("A = Q D Q^{-1} ="), Q, D, Q.transpose())

if print_ans:
    print("eigenvalues of A:", eigs)
    print("eigenvectors of A = columns of Q")
    pretty_print(LatexExpr("A ="),
                 eigs[0], Q[:,0] * Q[:,0].transpose(),
                 LatexExpr("+"),
                 eigs[1], Q[:,1:] * Q[:,1:].transpose())
```
##### Exercise 1(a)
求 $A$ 的所有特徵值及其對應的特徵向量。
<!-- eng start -->
Find the spectrum of $A$ and the corresponding eigenvectors.
<!-- eng end -->
With `seed = 8`, we get
$$
A = \begin{bmatrix}
5 & 0 & 0 \\
0 & -2 & 0\\
0 & 0 & 5 \\
\end{bmatrix}.
$$
$\spec(A) = \{-2,5,5\}$, and the corresponding eigenvectors are, respectively,
$$
\begin{bmatrix}
0 \\ 1 \\ 0 \\
\end{bmatrix},\quad
\begin{bmatrix}
0 \\ 0 \\ 1 \\
\end{bmatrix},\quad
\begin{bmatrix}
-1 \\ 0 \\ 0 \\
\end{bmatrix}.
$$
##### Exercise 1(b)
求 $A$ 的譜分解。
<!-- eng start -->
Find the spectral decomposition of $A$.
<!-- eng end -->
For each eigenvalue $\lambda$, find a basis of $\ker(A - \lambda I)$ and normalize each basis vector to length $1$.
For $\lambda = -2$:
$$
A - \lambda I = \begin{bmatrix}
7 & 0 & 0 \\
0 & 0 & 0 \\
0 & 0 & 7 \\
\end{bmatrix}.
$$
By observation,
$\ker(A-\lambda I) = \vspan\left\{\begin{bmatrix}
0\\1\\0 \\
\end{bmatrix}\right\}.$
For $\lambda = 5$:
$$
A - \lambda I = \begin{bmatrix}
0 & 0 & 0 \\
0 & -7 & 0 \\
0 & 0 & 0 \\
\end{bmatrix}.
$$
By observation,
$\ker(A-\lambda I) = \vspan\left\{\begin{bmatrix}
1\\0 \\ 0 \\
\end{bmatrix},\begin{bmatrix}
0\\0 \\ 1 \\
\end{bmatrix}\right\}.$
We get
$S= \begin{bmatrix}
0 & 1 & 0 \\
1 & 0 & 0\\
0 & 0 & 1 \\
\end{bmatrix}.$
According to $\spec(A) = \{-2,5,5\}$, we have
$$A = -2P_1 + 5P_2.
$$
Let $\bu_1, \bu_2, \bu_3$ be the columns of $S$. Then $P_1 = \bu_1\bu_1\trans$ and $P_2 = \bu_2\bu_2\trans + \bu_3\bu_3\trans$.
$P_1 = \begin{bmatrix}0 \\1\\0\\\end{bmatrix}\begin{bmatrix}0&1&0\\\end{bmatrix}= \begin{bmatrix}
0 & 0 & 0\\
0 & 1 & 0\\
0 & 0 & 0\\
\end{bmatrix}.$
$P_2= \begin{bmatrix}1 \\0\\0\\\end{bmatrix}\begin{bmatrix}1&0&0\\\end{bmatrix}+\begin{bmatrix}0\\0\\1\\\end{bmatrix}\begin{bmatrix}0 &0&1\\\end{bmatrix} = \begin{bmatrix}
1 &0& 0\\
0 &0 & 0\\
0 & 0 & 0\\
\end{bmatrix}+ \begin{bmatrix}
0&0& 0\\
0&0 & 0\\
0& 0&1\\
\end{bmatrix}=\begin{bmatrix}
1 &0 & 0\\
0 &0& 0\\
0& 0&1\\
\end{bmatrix}$
Thus
$A =-2\begin{bmatrix}
0 &0& 0\\
0 &1 & 0\\
0 & 0 & 0\\
\end{bmatrix} + 5\begin{bmatrix}
1 &0 & 0\\
0 &0& 0\\
0& 0&1\\
\end{bmatrix}.$
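
As a sanity check (a hypothetical Sage snippet, not part of the exercise), the decomposition found above can be verified directly:

```python
# Verify the spectral decomposition found above (seed = 8 case).
A  = diagonal_matrix(QQ, [5, -2, 5])
P1 = matrix(QQ, [[0,0,0], [0,1,0], [0,0,0]])
P2 = matrix(QQ, [[1,0,0], [0,0,0], [0,0,1]])
print(A == -2*P1 + 5*P2)                  # True: A = -2 P1 + 5 P2
print(P1^2 == P1 and P2^2 == P2)          # True: both are idempotent
print(P1*P2 == zero_matrix(QQ, 3))        # True: P1 P2 = O
print(P1 + P2 == identity_matrix(QQ, 3))  # True: P1 + P2 = I
```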
:::info
What do the experiments try to tell you? (open answer)
If $A$ is a symmetric matrix, then the spectral theorem and the spectral decomposition let us represent the matrix $A$ in another form.
:::
## Exercises
##### Exercise 2
求以下矩陣的譜分解。
<!-- eng start -->
For each of the following matrices, find its spectral decomposition.
<!-- eng end -->
##### Exercise 2(a)
$$
A = \begin{bmatrix}
0 & 1 \\
1 & 0
\end{bmatrix}.
$$
According to $p_A(x) = \det(A-xI) = (x+1)(x-1) = 0$, we can find that
$$\spec(A) = \{1,-1\}.$$
For each eigenvalue $\lambda$, find a basis of $\ker(A - \lambda I)$ and normalize each basis vector to length $1$.
For $\lambda = 1$:
$A-\lambda I = \begin{bmatrix}
-1 & 1 \\
1 & -1
\end{bmatrix}.$ By observation,
$\ker(A-\lambda I) = \vspan\left\{\begin{bmatrix}1\\1\end{bmatrix}\right\} = \vspan\left\{\begin{bmatrix}\frac{1}{\sqrt2}\\\frac{1}{\sqrt2}\end{bmatrix}\right\}.$
For $\lambda = -1$:
$A-\lambda I = \begin{bmatrix}
1 & 1 \\
1 & 1
\end{bmatrix}.$ By observation,
$\ker(A-\lambda I) = \vspan\left\{\begin{bmatrix}1\\-1\end{bmatrix}\right\} = \vspan\left\{\begin{bmatrix}\frac{1}{\sqrt2}\\\frac{-1}{\sqrt2}\end{bmatrix}\right\}.$
We get
$S= \begin{bmatrix}
\frac{1}{\sqrt2} & \frac{1}{\sqrt2} \\
\frac{1}{\sqrt2} & \frac{-1}{\sqrt2}
\end{bmatrix}.$
According to $\spec(A) = \{1,-1\}$, we have
$$A = 1P_1 + (-1)P_2.
$$
Let $\bu_1, \bu_2$ be the columns of $S$. Then $P_1 = \bu_1\bu_1\trans$ and $P_2 = \bu_2\bu_2\trans$.
$P_1 = \begin{bmatrix}\frac{1}{\sqrt2} \\\frac{1}{\sqrt2}\end{bmatrix}\begin{bmatrix}\frac{1}{\sqrt2} &\frac{1}{\sqrt2}\end{bmatrix}= \begin{bmatrix}\frac{1}{2} & \frac{1}{2}\\\frac{1}{2} & \frac{1}{2}
\end{bmatrix}.$
$P_2 = \begin{bmatrix}\frac{1}{\sqrt2} \\\frac{-1}{\sqrt2}\end{bmatrix}\begin{bmatrix}\frac{1}{\sqrt2} &\frac{-1}{\sqrt2}\end{bmatrix}=\begin{bmatrix}\frac{1}{2} & \frac{-1}{2}\\\frac{-1}{2} & \frac{1}{2}\end{bmatrix}.$
Thus,
$$A = 1\begin{bmatrix}\frac{1}{2} & \frac{1}{2}\\\frac{1}{2} & \frac{1}{2}\end{bmatrix} + (-1)\begin{bmatrix}\frac{1}{2} & \frac{-1}{2}\\\frac{-1}{2} & \frac{1}{2}\end{bmatrix}=\begin{bmatrix}
0 & 1 \\
1 & 0\end{bmatrix}.$$
##### Exercise 2(b)
$$
A = \begin{bmatrix}
1 & 1 & 1 \\
1 & 1 & 1 \\
1 & 1 & 1 \\
\end{bmatrix}.
$$
##### Answer for Exercise 2(b):
Let's find the eigenvalues of $A$ first.\
The characteristic polynomial of $A$ is computed from\
$s_0 = 1$,\
$s_1 = \tr(A) = 1 + 1 + 1 = 3$,\
$s_2 = $ sum of the $2\times 2$ principal minors $= 0 + 0 + 0 = 0$,\
$s_3 = \det(A) = 0$, so
$$
\begin{aligned}
p_A(x) &= s_0(-x)^3 + s_1(-x)^2 + s_2(-x) + s_3\\
&= (-x)^3 + 3(-x)^2 + 0(-x) + 0\\
&= -x^2(x-3).
\end{aligned}
$$
$\spec(A) = \{0, 0, 3\}$.
For each eigenvalue $\lambda$, find a basis of $\ker(A - \lambda I)$ and normalize each basis vector to length $1$.
When $\lambda = 0$,
$$
(A-\lambda I) = \begin{bmatrix}
1 & 1 & 1 \\
1 & 1 & 1 \\
1 & 1 & 1
\end{bmatrix}.
$$
$\ker(A-\lambda I) = \vspan\left\{\begin{bmatrix}-1\\1\\0\end{bmatrix},\begin{bmatrix}1\\1\\-2\end{bmatrix}\right\} =
\vspan\left\{\begin{bmatrix}\frac{-1}{\sqrt2}\\\frac{1}{\sqrt2}\\0\end{bmatrix},\begin{bmatrix}\frac{-1}{\sqrt6}\\\frac{-1}{\sqrt6}\\\frac{2}{\sqrt6}\end{bmatrix}\right\}.$
When $\lambda = 3$,
$$
(A-\lambda I) = \begin{bmatrix}
-2 & 1 & 1 \\
1 & -2 & 1 \\
1 & 1 & -2
\end{bmatrix}.
$$
$\ker(A-\lambda I) = \vspan\left\{\begin{bmatrix}1\\1\\1\end{bmatrix}\right\} =
\vspan\left\{\begin{bmatrix}\frac{1}{\sqrt3}\\\frac{1}{\sqrt3}\\\frac{1}{\sqrt3}\end{bmatrix}\right\}.$
We get $S$:
$$
S = \begin{bmatrix}
\frac{-1}{\sqrt2} & \frac{-1}{\sqrt6} &\frac{1}{\sqrt3} \\
\frac{1}{\sqrt2}& \frac{-1}{\sqrt6} & \frac{1}{\sqrt3} \\
0 & \frac{2}{\sqrt6} & \frac{1}{\sqrt3} \\
\end{bmatrix}.
$$
According to $\spec(A) = \{0,0,3\}$, we have
$$A = 0P_1 + 3P_2.$$
$P_1 =
\begin{bmatrix}\frac{-1}{\sqrt2} \\\frac{1}{\sqrt2}\\0\end{bmatrix}\begin{bmatrix}\frac{-1}{\sqrt2} &\frac{1}{\sqrt2}&0\end{bmatrix} +
\begin{bmatrix}\frac{-1}{\sqrt6} \\\frac{-1}{\sqrt6}\\\frac{2}{\sqrt6}\end{bmatrix}\begin{bmatrix}\frac{-1}{\sqrt6}&\frac{-1}{\sqrt6}&\frac{2}{\sqrt6}\end{bmatrix}=
\begin{bmatrix}
\frac{2}{3} & \frac{-1}{3} &\frac{-1}{3} \\
\frac{-1}{3}& \frac{2}{3} & \frac{-1}{3} \\
\frac{-1}{3} & \frac{-1}{3} & \frac{2}{3} \\
\end{bmatrix}.$
$P_2 =
\begin{bmatrix}\frac{1}{\sqrt3} \\\frac{1}{\sqrt3}\\\frac{1}{\sqrt3}\end{bmatrix}\begin{bmatrix}\frac{1}{\sqrt3} &\frac{1}{\sqrt3}&\frac{1}{\sqrt3}\end{bmatrix}=
\begin{bmatrix}
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\end{bmatrix}.$
Now we can confirm that $$A = 0P_1 + 3P_2.$$
$$A =
0\begin{bmatrix}
\frac{2}{3} & \frac{-1}{3} &\frac{-1}{3} \\
\frac{-1}{3}& \frac{2}{3} & \frac{-1}{3} \\
\frac{-1}{3} & \frac{-1}{3} & \frac{2}{3} \\
\end{bmatrix}+
3\begin{bmatrix}
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\
\end{bmatrix}=
\begin{bmatrix}
1 & 1 & 1 \\
1 & 1 & 1 \\
1 & 1 & 1 \\
\end{bmatrix}.
$$
:::warning
:warning: Remember to make the two eigenvectors for $\lambda = 0$ orthonormal to each other.
:::
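
For instance (a hypothetical Sage snippet), Gram–Schmidt over the real algebraic numbers `AA` orthonormalizes the two eigenvectors for $\lambda = 0$ exactly:

```python
# Orthonormalize the eigenvectors (-1,1,0) and (1,1,-2) for lambda = 0.
B = matrix(AA, [[-1, 1, 0], [1, 1, -2]])
G, _ = B.gram_schmidt(orthonormal=True)
print(G * G.transpose() == identity_matrix(2))  # rows of G are orthonormal
```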
##### Exercise 2(c)
$$
A = \begin{bmatrix}
2 & -1 & -1 \\
-1 & 2 & -1 \\
-1 & -1 & 2
\end{bmatrix}.
$$
**[Contributed by 蔡睿丞]**
According to
$p_A(x) = \det(A-xI) = -x(x-3)^2$,
$\spec(A) = \{0, 3, 3\}$.
When $\lambda = 3$,
$$
(A-\lambda I) = \begin{bmatrix}
-1 & -1 & -1 \\
-1 & -1 & -1 \\
-1 & -1 & -1
\end{bmatrix}.
$$
$\ker(A-\lambda I) = \vspan\left\{\begin{bmatrix}-1\\1\\0\end{bmatrix},\begin{bmatrix}1\\1\\-2\end{bmatrix}\right\}= \vspan\left\{\begin{bmatrix}\frac{-1}{\sqrt2} \\\frac{1}{\sqrt2}\\0\\\end{bmatrix},\begin{bmatrix}\frac{1}{\sqrt6} \\\frac{1}{\sqrt6}\\ \frac{-2}{\sqrt6}\\\end{bmatrix}\right\}.$
When $\lambda = 0$,
$$
(A-\lambda I) = \begin{bmatrix}
2 & -1 & -1 \\
-1 & 2 & -1 \\
-1 & -1 & 2
\end{bmatrix}.
$$
$\ker(A-\lambda I) = \vspan\left\{\begin{bmatrix}1\\1\\1\end{bmatrix}\right\}=\vspan\left\{\begin{bmatrix}\frac{1}{\sqrt3} \\\frac{1}{\sqrt3}\\ \frac{1}{\sqrt3}\\\end{bmatrix}\right\}.$
We get $S$:
$$
S = \begin{bmatrix}
\frac{1}{\sqrt3} & \frac{-1}{\sqrt2} & \frac{1}{\sqrt6} \\
\frac{1}{\sqrt3} & \frac{1}{\sqrt2} & \frac{1}{\sqrt6} \\
\frac{1}{\sqrt3} & 0 & \frac{-2}{\sqrt6} \\
\end{bmatrix}.
$$
According to $\spec(A) = \{0,3,3\}$, we have
$$A = 0P_1 + 3P_2.
$$
$P_1 = \begin{bmatrix}\frac{1}{\sqrt3} \\\frac{1}{\sqrt3}\\\frac{1}{\sqrt3}\\\end{bmatrix}\begin{bmatrix}\frac{1}{\sqrt3} &\frac{1}{\sqrt3}&\frac{1}{\sqrt3}\\\end{bmatrix}= \begin{bmatrix}
\frac{1}{3} & \frac{1}{3} & \frac{1}{3}\\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3}\\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3}\\
\end{bmatrix}.$
$P_2 = \begin{bmatrix}\frac{-1}{\sqrt2} \\\frac{1}{\sqrt2}\\0\\\end{bmatrix}\begin{bmatrix}\frac{-1}{\sqrt2} &\frac{1}{\sqrt2}&0\\\end{bmatrix}+\begin{bmatrix}\frac{1}{\sqrt6} \\\frac{1}{\sqrt6}\\\frac{-2}{\sqrt6}\\\end{bmatrix}\begin{bmatrix}\frac{1}{\sqrt6} &\frac{1}{\sqrt6}&\frac{-2}{\sqrt6}\\\end{bmatrix} = \begin{bmatrix}
\frac{1}{2} &\frac{-1}{2} & 0\\
\frac{-1}{2} &\frac{1}{2} & 0\\
0 & 0 & 0\\
\end{bmatrix}+ \begin{bmatrix}
\frac{1}{6} &\frac{1}{6} & -\frac{1}{3}\\
\frac{1}{6} &\frac{1}{6} & -\frac{1}{3}\\
-\frac{1}{3}& -\frac{1}{3}& \frac{2}{3}\\
\end{bmatrix}=\begin{bmatrix}
\frac{2}{3} &-\frac{1}{3} &-\frac{1}{3}\\
-\frac{1}{3} &\frac{2}{3}& -\frac{1}{3}\\
-\frac{1}{3}& -\frac{1}{3}& \frac{2}{3}\\
\end{bmatrix}$
Thus, $A = 0\begin{bmatrix}
\frac{1}{3} & \frac{1}{3} & \frac{1}{3}\\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3}\\
\frac{1}{3} & \frac{1}{3} & \frac{1}{3}\\
\end{bmatrix} + 3\begin{bmatrix}
\frac{2}{3} &-\frac{1}{3} &-\frac{1}{3}\\
-\frac{1}{3} &\frac{2}{3}& -\frac{1}{3}\\
-\frac{1}{3}& -\frac{1}{3}& \frac{2}{3}\\
\end{bmatrix}=\begin{bmatrix}
2 & -1 & -1 \\
-1 & 2 & -1 \\
-1 & -1 & 2 \\
\end{bmatrix}.$
##### Exercise 3
令 $\bu$ 為一長度為 $1$ 的實向量。
令 $P = \bu\bu\trans$。
<!-- eng start -->
Let $\bu$ be a real vector of length $1$. Let $P = \bu\bu\trans$.
<!-- eng end -->
##### Exercise 3(a)
說明 $P$ 為垂直投影到 $\vspan\{\bu\}$ 的投影矩陣。
<!-- eng start -->
Explain why $P$ is the projection matrix onto $\vspan\{\bu\}$.
<!-- eng end -->
##### Answer for Exercise 3(a):
In order to explain why $P$ is the projection matrix onto $\vspan\{\bu\}$,\
we need to show that $P\bv$ lies in $\vspan\{\bu\}$ for every vector $\bv$, and that $P$ is idempotent, as a projection matrix should be.
Indeed, $P\bv = \bu(\bu\trans\bv) = \inp{\bu}{\bv}\bu \in \vspan\{\bu\}$, and $P^2 = \bu(\bu\trans\bu)\bu\trans = \bu\bu\trans = P$ since $\bu\trans\bu = 1$.
:::warning
It is okay. But it would be easier to think about the projection formula
$$
\frac{\inp{\bu}{\bv}}{\|\bu\|^2} \bu.
$$
:::
##### Exercise 3(b)
證明 $\tr(P\trans P) = \rank(P) = 1$。
<!-- eng start -->
Show that $\tr(P\trans P) = \rank(P) = 1$.
<!-- eng end -->
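
This is not a proof, but the claim is easy to sanity-check with a concrete unit vector (a hypothetical Sage snippet):

```python
# Check tr(P^T P) = rank(P) = 1 for P = u u^T with a unit vector u.
u = vector(QQ, [3, 4]) / 5          # |u| = 1
P = u.column() * u.row()
print((P.transpose() * P).trace())  # 1
print(P.rank())                     # 1
```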
##### Exercise 4
令 $\{\bu_1, \ldots, \bu_d\}$ 為一群互相垂直且長度均為 $1$ 的實向量。
令 $P = \bu_1\bu_1\trans + \cdots + \bu_d\bu_d\trans$。
<!-- eng start -->
Let $\{\bu_1, \ldots, \bu_d\}$ be an orthonormal set of real vectors. Let $P = \bu_1\bu_1\trans + \cdots + \bu_d\bu_d\trans$.
<!-- eng end -->
##### Exercise 4(a)
說明 $P$ 為垂直投影到 $\vspan\{\bu_1,\ldots, \bu_d\}$ 的投影矩陣。
<!-- eng start -->
Explain why $P$ is the projection matrix onto $\vspan\{\bu_1,\ldots, \bu_d\}$.
<!-- eng end -->
##### Exercise 4(b)
證明 $\tr(P\trans P) = \rank(P) = d$。
<!-- eng start -->
Show that $\tr(P\trans P) = \rank(P) = d$.
<!-- eng end -->
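
The same sanity check extends to $d > 1$; here with $d = 2$ orthonormal vectors in $\mathbb{R}^3$ (a hypothetical Sage snippet):

```python
# Check tr(P^T P) = rank(P) = d for d = 2.
u1 = vector(QQ, [1, 0, 0])
u2 = vector(QQ, [0, 3, 4]) / 5      # unit and orthogonal to u1
P = u1.column()*u1.row() + u2.column()*u2.row()
print((P.transpose() * P).trace())  # 2
print(P.rank())                     # 2
```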
##### Exercise 5
一個 **垂直投影矩陣** 指的是一個可以被垂直矩陣對角化且特徵值均是 $1$ 或 $0$ 的矩陣。
令 $P$ 為一實方陣。
證明以下敘述等價:
1. $P$ 為一垂直投影矩陣。
2. $P$ 是對稱矩陣,且 $P^2 = P$。
<!-- eng start -->
An **orthogonal projection matrix** is a matrix such that it can be diagonalized by a real orthogonal matrix and has all its eigenvalues equal to $1$ or $0$.
Let $P$ be a real square matrix. Show that the following are equivalent:
1. $P$ is an orthogonal projection matrix.
2. $P$ is symmetric and $P^2 = P$.
<!-- eng end -->
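
A concrete instance of the equivalence (a hypothetical Sage snippet): the matrix $P_1$ from Exercise 2(a) is symmetric and idempotent, and its eigenvalues are $0$ and $1$.

```python
P = matrix(QQ, [[1/2, 1/2], [1/2, 1/2]])
print(P.transpose() == P and P^2 == P)  # True: condition 2 holds
print(sorted(P.eigenvalues()))          # [0, 1], matching condition 1
```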
##### Exercise 6
雖然譜分解裡的條件沒有明顯說明 $P_i$ 是垂直投影矩陣,
依照以下步驟證明下列條件
1. $A = \sum_{j=1}^q \mu_j P_j$,
2. $P_i^2 = P_i$ for any $i$,
3. $P_iP_j = O$ for any $i \neq j$, and
4. $\sum_{j=1}^q P_j = I_n$.
足以說明每一個 $P_i$ 都是垂直投影矩陣。
<!-- eng start -->
Although the statements in the spectral decomposition do not explicitly say that each $P_i$ is an orthogonal projection matrix, use the given instructions to show that the conditions
1. $A = \sum_{j=1}^q \mu_j P_j$,
2. $P_i^2 = P_i$ for any $i$,
3. $P_iP_j = O$ for any $i \neq j$, and
4. $\sum_{j=1}^q P_j = I_n$
are enough to guarantee that each $P_i$ is an orthogonal projection matrix.
<!-- eng end -->
##### Exercise 6(a)
驗證
$$
\begin{aligned}
I &= P_1 + \cdots + P_q, \\
A &= \mu_1 P_1 + \cdots + \mu_q P_q, \\
A^2 &= \mu_1^2 P_1 + \cdots + \mu_q^2 P_q, \\
~ & \vdots \\
A^{q-1} &= \mu_1^{q-1} P_1 + \cdots + \mu_q^{q-1} P_q.
\end{aligned}
$$
並利用拉格朗日多項式來說明對每一個 $i = 1,\ldots, q$ 來說,
都找得到一些係數 $c_0,\ldots,c_{q-1}$ 使得 $P_i = c_0 I + c_1 A + \cdots + c_{q-1} A^{q-1}$。
因此每一個 $P_i$ 都是對稱矩陣。
<!-- eng start -->
Verify that
$$
\begin{aligned}
I &= P_1 + \cdots + P_q, \\
A &= \mu_1 P_1 + \cdots + \mu_q P_q, \\
A^2 &= \mu_1^2 P_1 + \cdots + \mu_q^2 P_q, \\
~ & \vdots \\
A^{q-1} &= \mu_1^{q-1} P_1 + \cdots + \mu_q^{q-1} P_q.
\end{aligned}
$$
Then use the Lagrange polynomials to show that for each $i = 1,\ldots, q$, there are some coefficients $c_0,\ldots,c_{q-1}$ such that $P_i = c_0 I + c_1 A + \cdots + c_{q-1} A^{q-1}$. Therefore, each $P_i$ is a symmetric matrix.
<!-- eng end -->
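
The Lagrange-polynomial construction can be tested concretely; here is a hypothetical Sage sketch using the matrix from Exercise 2(c):

```python
# Express each P_i as a Lagrange polynomial evaluated at A.
A = matrix(QQ, [[2, -1, -1], [-1, 2, -1], [-1, -1, 2]])
mus = sorted(set(A.eigenvalues()))   # distinct eigenvalues: [0, 3]
I3 = identity_matrix(QQ, 3)
for mu in mus:
    # this polynomial is 1 at mu and 0 at the other eigenvalues
    P = prod((A - nu*I3) / (mu - nu) for nu in mus if nu != mu)
    print(P^2 == P and P.transpose() == P)  # idempotent and symmetric
```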
##### Exercise 6(b)
說明每一個 $P_i$ 都是垂直投影矩陣。
<!-- eng start -->
Show that each $P_i$ is an orthogonal projection matrix.
<!-- eng end -->
##### Exercise 7
依照以下步驟證明下述定理。
##### Eigenvector-eigenvalue identity
若 $A$ 為一 $n\times n$ 實對稱矩陣。
其特徵值為 $\lambda_1,\ldots,\lambda_n$ 且某一個 $\lambda_i$ 只出現一次沒有重覆。
令 $\bv_1,\ldots, \bv_n$ 為其相對應的特徵向量,且其形成一垂直標準基。
則
$$
(A - \lambda_i I)\adj = \left(\prod_{j\neq i}(\lambda_j - \lambda_i)\right)\bv_i\bv_i\trans.
$$
<!-- eng start -->
Use the given instructions to prove the following theorem.
##### Eigenvector-eigenvalue identity
Let $A$ be an $n\times n$ real symmetric matrix such that $\lambda_1,\ldots,\lambda_n$ are its eigenvalues and $\lambda_i$ is a simple eigenvalue. Let $\bv_1,\ldots, \bv_n$ be the corresponding eigenvectors, which form an orthonormal basis. Then
$$
(A - \lambda_i I)\adj = \left(\prod_{j\neq i}(\lambda_j - \lambda_i)\right)\bv_i\bv_i\trans.
$$
<!-- eng end -->
##### Exercise 7(a)
說明當 $x$ 不為 $A$ 的特徵值時,
$$
\begin{aligned}
(A - xI)\adj &= \det(A - xI) \times \sum_{j = 1}^n (\lambda_j - x)^{-1}\bv_j\bv_j\trans \\
&= \sum_{j = 1}^n p_j(x) \bv_j\bv_j\trans,
\end{aligned}
$$
其中
$$
p_j(x) = \prod_{k \neq j}(\lambda_k - x).
$$
<!-- eng start -->
Show that when $x$ is not an eigenvalue of $A$, we have
$$
\begin{aligned}
(A - xI)\adj &= \det(A - xI) \times \sum_{j = 1}^n (\lambda_j - x)^{-1}\bv_j\bv_j\trans \\
&= \sum_{j = 1}^n p_j(x) \bv_j\bv_j\trans,
\end{aligned}
$$
where
$$
p_j(x) = \prod_{k \neq j}(\lambda_k - x).
$$
<!-- eng end -->
##### Exercise 7(b)
將 $x$ 趨近到 $\lambda_i$ 並證明特徵向量-特徵值定理。
<!-- eng start -->
Prove the eigenvector-eigenvalue identity by letting $x$ go to $\lambda_i$.
<!-- eng end -->
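
A numerical check of the identity (a hypothetical Sage snippet) with the matrix from Exercise 2(c), whose simple eigenvalue is $\lambda_i = 0$ and whose other eigenvalues are both $3$:

```python
# Check (A - lambda_i I)^adj = prod_{j != i} (lambda_j - lambda_i) v_i v_i^T.
A = matrix(QQ, [[2, -1, -1], [-1, 2, -1], [-1, -1, 2]])  # spec = {0, 3, 3}
v = vector(QQ, [1, 1, 1])            # eigenvector for the simple eigenvalue 0
P = v.column() * v.row() / (v * v)   # v_i v_i^T with v_i normalized
lhs = (A - 0*identity_matrix(3)).adjugate()
rhs = (3 - 0) * (3 - 0) * P
print(lhs == rhs)                    # True
```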
:::info
collaboration: 2
3 problems: 3
- 2a, 2b, 3a
moderator: 0
qc: 1
:::