# 對一個特徵向量化簡
Reduction

This work by Jephian Lin is licensed under a [Creative Commons Attribution 4.0 International License](http://creativecommons.org/licenses/by/4.0/).
$\newcommand{\trans}{^\top}
\newcommand{\adj}{^{\rm adj}}
\newcommand{\cof}{^{\rm cof}}
\newcommand{\inp}[2]{\left\langle#1,#2\right\rangle}
\newcommand{\dunion}{\mathbin{\dot\cup}}
\newcommand{\bzero}{\mathbf{0}}
\newcommand{\bone}{\mathbf{1}}
\newcommand{\ba}{\mathbf{a}}
\newcommand{\bb}{\mathbf{b}}
\newcommand{\bc}{\mathbf{c}}
\newcommand{\bd}{\mathbf{d}}
\newcommand{\be}{\mathbf{e}}
\newcommand{\bh}{\mathbf{h}}
\newcommand{\bp}{\mathbf{p}}
\newcommand{\bq}{\mathbf{q}}
\newcommand{\br}{\mathbf{r}}
\newcommand{\bx}{\mathbf{x}}
\newcommand{\by}{\mathbf{y}}
\newcommand{\bz}{\mathbf{z}}
\newcommand{\bu}{\mathbf{u}}
\newcommand{\bv}{\mathbf{v}}
\newcommand{\bw}{\mathbf{w}}
\newcommand{\tr}{\operatorname{tr}}
\newcommand{\nul}{\operatorname{null}}
\newcommand{\rank}{\operatorname{rank}}
%\newcommand{\ker}{\operatorname{ker}}
\newcommand{\range}{\operatorname{range}}
\newcommand{\Col}{\operatorname{Col}}
\newcommand{\Row}{\operatorname{Row}}
\newcommand{\spec}{\operatorname{spec}}
\newcommand{\vspan}{\operatorname{span}}
\newcommand{\Vol}{\operatorname{Vol}}
\newcommand{\sgn}{\operatorname{sgn}}
\newcommand{\idmap}{\operatorname{id}}
\newcommand{\am}{\operatorname{am}}
\newcommand{\gm}{\operatorname{gm}}
\newcommand{\mult}{\operatorname{mult}}
\newcommand{\iner}{\operatorname{iner}}$
```python
from lingeo import random_int_vec
```
## Main idea
We start with some basic ideas on complex matrices.
Let $A$ be a complex matrix.
Then the **conjugate transpose** of $A$, denoted by $A^*$, is the matrix obtained from $A\trans$ by taking the complex conjugate of each entry.
Recall that if $\bx$ and $\by$ are complex column vectors, then their inner product is $\inp{\bx}{\by} = \by^* \bx$.
If a complex matrix $A$ satisfies $A^* A = AA^* = I$, then it is called a **unitary** matrix.
In comparison, if a real matrix satisfies $A\trans A = AA\trans = I$, then it is an orthogonal matrix.
Let $A$ be an $n\times n$ complex matrix.
Then the following are equivalent:
- $A$ is a unitary matrix.
- $A^{-1} = A^*$.
- The columns of $A$ form an orthonormal basis of $\mathbb{C}^n$.
- The rows of $A$ form an orthonormal basis of $\mathbb{C}^n$.
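These conditions are easy to check by machine.  Below is a minimal Sage sketch (my own example, not part of the course code) that verifies them for the unitary matrix $\frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ i & -i \end{bmatrix}$.
```python
# a minimal Sage sketch: verify the equivalent conditions on a concrete unitary matrix
Q = matrix(QQbar, [[1, 1], [I, -I]]) / QQbar(2).sqrt()
Qstar = Q.conjugate_transpose()

print(Qstar * Q == identity_matrix(QQbar, 2))   # Q^* Q = I
print(Q * Qstar == identity_matrix(QQbar, 2))   # Q Q^* = I
print(Q.inverse() == Qstar)                     # Q^{-1} = Q^*
# the (j,k) entry of Q^* Q is the inner product of the k-th and j-th columns,
# so Q^* Q = I says exactly that the columns form an orthonormal basis of C^2
```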
Let $\bv\in\mathbb{C}^n$ be a nonzero vector.
Then one may extend $\bv$ to a basis $\beta$ of $\mathbb{C}^n$ whose first vector is $\bv$.
Let $Q$ be the matrix whose columns are the vectors in $\beta$. Then $Q$ is an invertible matrix whose first column is $\bv$.
If necessary, one may apply the Gram–Schmidt process to obtain an orthonormal basis of $\mathbb{C}^n$ whose first vector is $\frac{\bv}{\|\bv\|}$.
Thus, there is a unitary matrix $Q$ whose first column is $\frac{\bv}{\|\bv\|}$.
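This construction can be carried out explicitly.  The following Sage sketch (with an arbitrarily chosen $\bv$; the helper `inp` is my own) extends $\bv$ by standard basis vectors, applies Gram–Schmidt with respect to $\inp{\bx}{\by} = \by^*\bx$, and checks that the resulting $Q$ is unitary with first column $\frac{\bv}{\|\bv\|}$.
```python
# a minimal Sage sketch of the construction; the vector v is chosen arbitrarily
v = vector(QQbar, [1, I, 1])
n = len(v)

# extend v by standard basis vectors, keeping a linearly independent subset
candidates = [v] + list(identity_matrix(QQbar, n).columns())
basis = []
for u in candidates:
    if matrix(basis + [u]).rank() == len(basis) + 1:
        basis.append(u)

# Gram--Schmidt with respect to <x, y> = y^* x
def inp(x, y):
    return y.conjugate() * x

ortho = []
for u in basis:
    w = u - sum((inp(u, y) / inp(y, y) * y for y in ortho), vector(QQbar, [0] * n))
    ortho.append(w)
onb = [w / inp(w, w).sqrt() for w in ortho]

Q = column_matrix(onb)
print(Q.conjugate_transpose() * Q == identity_matrix(QQbar, n))   # Q is unitary
print(Q.column(0) == v / inp(v, v).sqrt())                        # first column is v / ||v||
```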
##### Reduction lemma
Let $A$ be a complex matrix.
Suppose $\bv$ is an eigenvector of $A$ with respect to the eigenvalue $\lambda$.
Let $Q$ be an invertible matrix whose first column is $\bv$.
Then $Q^{-1}AQ$ has the form
$$
\begin{bmatrix}
\lambda & * \\
\bzero & A_2
\end{bmatrix}.
$$
Moreover, $Q$ can be chosen as a unitary matrix whose first column is $\frac{\bv}{\|\bv\|}$.
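Here is a small Sage sketch of the lemma in action (the matrix $A$ below is an ad hoc example of mine, not one from the exercises): put an eigenvector in the first column of an invertible $Q$ and observe the zero block.
```python
# a minimal Sage sketch of the reduction lemma; A is an ad hoc example
A = matrix(QQ, [[1, 2, 0], [0, 1, 2], [2, 0, 1]])
v = vector(QQ, [1, 1, 1])                      # eigenvector of A for lambda = 3
lam = 3
print(A * v == lam * v)

# invertible Q whose first column is v, extended by standard basis vectors
Q = column_matrix(QQ, [[1, 1, 1], [0, 1, 0], [0, 0, 1]])
B = Q.inverse() * A * Q
pretty_print(B)                                # first column is (3, 0, 0)^T
print(B.column(0) == vector(QQ, [lam, 0, 0]))
```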
##### Remark
Note that the eigenvalues of a real matrix are not necessarily all real.
Suppose $A$ is a real matrix and $\lambda$ is a real eigenvalue of $A$.
Then the eigenvector $\bv\in\ker(A - \lambda I)$ can be chosen to be real.
Also, the $Q$ matrix in the reduction lemma can be chosen to be orthogonal.
However, $A_2$ may still have non-real eigenvalues.
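For instance (an ad hoc example of mine, not from the exercises), the matrix below is real with the real eigenvalue $1$, yet after the reduction the block $A_2$ has the non-real eigenvalues $1\pm\sqrt{3}i$.
```python
# real matrix, real eigenvalue 1 with eigenvector (1,1,1)^T, but A2 has non-real eigenvalues
A = matrix(QQ, [[1, 1, -1], [-1, 1, 1], [1, -1, 1]])
v = vector(QQ, [1, 1, 1])
print(A * v == v)                                          # lambda = 1

Q = column_matrix(QQ, [[1, 1, 1], [0, 1, 0], [0, 0, 1]])   # real invertible, first column v
B = Q.inverse() * A * Q
A2 = B[1:, 1:]
print(A2.charpoly())                                       # x^2 - 2*x + 4, no real roots
print(A2.eigenvalues())                                    # 1 ± sqrt(3)*i
```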
## Side stories
- all-ones vector
- cases of real matrices
- properties of unitary/orthogonal matrices
- discrete Fourier transform matrix
## Experiments
##### Exercise 1
執行以下程式碼。
令 $\beta = \{\bu_1,\ldots,\bu_n\}$ 為 $Q$ 的行向量集合。
<!-- eng start -->
Run the code below. Let $\beta = \{\bu_1,\ldots,\bu_n\}$ be the columns of $Q$.
<!-- eng end -->
```python
### code
set_random_seed(0)
print_ans = False
n = 4
Q = identity_matrix(n)
Q[1:,0] = random_int_vec(n-1, 3)
D = matrix(n, random_int_vec(n**2,3))
D[1:,0] = vector([0] * (n-1))
A = Q * D * Q.inverse()
print("n =", n)
pretty_print(LatexExpr("A ="), A)
pretty_print(LatexExpr("Q ="), Q)
if print_ans:
    print("The representation of f_A(u1) with respect to beta is")
    pretty_print(D[:,0])
    pretty_print(LatexExpr("Q^{-1} ="), Q.inverse())
    pretty_print(LatexExpr("Q^{-1} A Q ="), Q.inverse() * A * Q)
```
When `seed = 0`, the generated matrices are
$$
A = \begin{bmatrix}
3 & -3 & -3 & -1\\
-1 & 10 & 7 & 4\\
23 & -8 & -12 & -5\\
16 & -2 & -6 & -2\\
\end{bmatrix},
Q = \begin{bmatrix}
1 & 0 & 0 & 0\\
-3 & 1 & 0 & 0\\
3 & 0 & 1 & 0\\
1 & 0 & 0 & 1\\
\end{bmatrix}.
$$
##### Exercise 1(a)
求 $[f_A(\bu_1)]_\beta$。
<!-- eng start -->
Find $[f_A(\bu_1)]_\beta$.
<!-- eng end -->
<font color="f000">Ans:</font>
$f_A(\bu_1)=A\cdot\bu_1=
\begin{bmatrix}
3 & -3 & -3 & -1\\
-1 & 10 & 7 & 4\\
23 & -8 & -12 & -5\\
16 & -2 & -6 & -2\\
\end{bmatrix}\cdot\begin{bmatrix}
1\\
-3\\
3\\
1\\
\end{bmatrix}$
$=\begin{bmatrix}
2\\
-6\\
6\\
2\\
\end{bmatrix}$.
Since $f_A(\bu_1) = 2\bu_1 = 2\bu_1 + 0\bu_2 + 0\bu_3 + 0\bu_4$, we have
$[f_A(\bu_1)]_\beta = \begin{bmatrix}
2\\
0\\
0\\
0\\
\end{bmatrix}$.
---
##### Exercise 1(b)
求 $Q^{-1}$。
<!-- eng start -->
Find $Q^{-1}$.
<!-- eng end -->
<font color="f000">Ans:</font>
By calculating the reduced echelon form of
$$
\left[\begin{array}{rrrr|rrrr}
1 & 0 & 0 & 0 & 1 & 0 & 0 & 0 \\
-3 & 1 & 0 & 0 & 0 & 1 & 0 & 0 \\
3 & 0 & 1 & 0 & 0 & 0 & 1 & 0 \\
1 & 0 & 0 & 1 & 0 & 0 & 0 & 1 \\
\end{array}\right],
$$
we get $Q^{-1} =$
$\begin{bmatrix}
1 & 0 & 0 & 0\\
3 & 1 & 0 & 0\\
-3 & 0 & 1 & 0\\
-1 & 0 & 0 & 1\\
\end{bmatrix}$ .
---
##### Exercise 1(c)
求 $[f_A]_\beta^\beta$.
<!-- eng start -->
Find $[f_A]_\beta^\beta$.
<!-- eng end -->
<font color="f000">Ans:</font>
Using $Q$ from the code above and $Q^{-1}$ from Exercise 1(b), we compute
$[f_A]_\beta^\beta = Q^{-1}AQ=$
$\begin{bmatrix}
1 & 0 & 0 & 0\\
3 & 1 & 0 & 0\\
-3 & 0 & 1 & 0\\
-1 & 0 & 0 & 1\\
\end{bmatrix}\begin{bmatrix}
3 & -3 & -3 & -1\\
-1 & 10 & 7 & 4\\
23 & -8 & -12 & -5\\
16 & -2 & -6 & -2\\
\end{bmatrix}\begin{bmatrix}
1 & 0 & 0 & 0\\
-3 & 1 & 0 & 0\\
3 & 0 & 1 & 0\\
1 & 0 & 0 & 1\\
\end{bmatrix} = \begin{bmatrix}
2 & -3 & -3 & -1\\
0 & 1 & -2 & 1\\
0 & 1 & -3 & -2\\
0 & 1 & -3 & -1\\
\end{bmatrix}.$
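These answers can be double-checked in Sage by re-entering the seed-0 matrices (a verification sketch, not part of the original code):
```python
# re-enter the seed-0 matrices from the experiment and check the answers of Exercise 1
A = matrix(QQ, [[3, -3, -3, -1], [-1, 10, 7, 4], [23, -8, -12, -5], [16, -2, -6, -2]])
Q = matrix(QQ, [[1, 0, 0, 0], [-3, 1, 0, 0], [3, 0, 1, 0], [1, 0, 0, 1]])

u1 = Q.column(0)
print(A * u1 == 2 * u1)            # 1(a): f_A(u1) = 2 u1, so [f_A(u1)]_beta = (2,0,0,0)^T
pretty_print(Q.inverse())          # 1(b)
pretty_print(Q.inverse() * A * Q)  # 1(c): [f_A]_beta^beta
```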
:::info
What do the experiments try to tell you? (open answer)
...
:::
---
## Exercises
##### Exercise 2
令 $\bone$ 為全一向量。
(其長度將由文意決定。)
已知以下矩陣 $A$ 皆有 $\bone$ 這個特徵向量。
求出 $A$ 的所有特徵值。
<!-- eng start -->
Let $\bone$ be the all-ones vector (whose dimension will be clear by the context). It is known that each of the following matrices has $\bone$ as an eigenvector. Find all eigenvalues of $A$.
<!-- eng end -->
##### Exercise 2(a)
$$
A = \begin{bmatrix}
0 & 1 & 1 \\
1 & 0 & 1 \\
1 & 1 & 0
\end{bmatrix}.
$$
<font color="f000">Ans:</font>
We can find the eigenvalues of $A$ by solving the characteristic equation $\det(A - \lambda I) = 0$.
Since
$$
A-\lambda I = \begin{bmatrix}
-\lambda & 1 & 1 \\
1 & -\lambda & 1 \\
1 & 1 & -\lambda
\end{bmatrix},
$$
the characteristic polynomial of $A$ is
$$\det(A-\lambda I) = -\lambda^3 + 3\lambda + 2 = (\lambda+1)^2(2-\lambda),$$
so the eigenvalues are $2$, $-1$, $-1$, i.e. $\spec(A) = \{-1,-1,2\}$.
(Alternatively, $A\bone = 2\bone$ already shows that $2$ is an eigenvalue.)
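A quick Sage verification of this computation (a sketch, not part of the course code):
```python
# check Exercise 2(a): characteristic polynomial and eigenvalues
A = matrix(QQ, [[0, 1, 1], [1, 0, 1], [1, 1, 0]])
print(A.charpoly())      # x^3 - 3*x - 2 = (x - 2)*(x + 1)^2
print(A.eigenvalues())   # 2, -1, -1 (in some order)
print(A * vector([1, 1, 1]) == 2 * vector([1, 1, 1]))   # the all-ones vector has eigenvalue 2
```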
---
##### Exercise 2(b)
$$
A = \begin{bmatrix}
0 & 1 & 1 & 1 \\
1 & 0 & 1 & 1 \\
1 & 1 & 0 & 1 \\
1 & 1 & 1 & 0
\end{bmatrix}.
$$
<font color="f000">Ans:</font> Let $\bv_1 = \begin{bmatrix}
1 \\ 1 \\ 1 \\ 1 \\
\end{bmatrix}$,
Expand $\bv_1$ into the basis of $\mathbb{R}^{4}$ $\beta = \{\bv_1,\bv_2,\bv_3,\bv_4\}$, and let it as the basis of $Q$ row vector, we can get
$$
Q = \begin{bmatrix}
1 & 0 & 0 & 0\\
1 & 1 & 0 & 0\\
1 & 0 & 1 & 0\\
1 & 0 & 0 & 1
\end{bmatrix} ,
$$
so $$
[f_A]_\beta^\beta = Q^{-1}AQ = \begin{bmatrix}
1 & 0 & 0 & 0\\
-1 & 1 & 0 & 0\\
-1 & 0 & 1 & 0\\
-1 & 0 & 0 & 1\\
\end{bmatrix}\begin{bmatrix}
0 & 1 & 1 & 1 \\
1 & 0 & 1 & 1 \\
1 & 1 & 0 & 1 \\
1 & 1 & 1 & 0
\end{bmatrix}\begin{bmatrix}
1 & 0 & 0 & 0\\
1 & 1 & 0 & 0\\
1 & 0 & 1 & 0\\
1 & 0 & 0 & 1\\
\end{bmatrix} = \begin{bmatrix}
3 & 1 & 1 & 1\\
0 & -1 & 0 & 0\\
0 & 0 & -1 & 0\\
0 & 0 & 0 & -1\\
\end{bmatrix}.
$$
The first column shows that $\lambda_1 = 3$ is an eigenvalue of $A$.
The lower-right block is $A_2 = \begin{bmatrix}
-1 & 0 & 0 \\
0 & -1 & 0\\
0 & 0 & -1
\end{bmatrix}$, so $\spec(A_2) = \{-1,-1,-1\}$.
Therefore $\spec(A) =\{3\}\cup\spec(A_2) = \{-1,-1,-1,3\}$.
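A quick Sage check of the reduction and the resulting spectrum:
```python
# check Exercise 2(b) with the reduction computed above
A = matrix(QQ, [[0, 1, 1, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 0]])
Q = matrix(QQ, [[1, 0, 0, 0], [1, 1, 0, 0], [1, 0, 1, 0], [1, 0, 0, 1]])
pretty_print(Q.inverse() * A * Q)   # first column (3,0,0,0)^T, lower-right block -I
print(A.eigenvalues())              # -1, -1, -1, 3
```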
---
##### Exercise 2(c)
$$
A = \begin{bmatrix}
1 & -1 & 0 \\
-1 & 2 & -1 \\
0 & -1 & 1
\end{bmatrix}.
$$
<font color="f000">Ans:</font> let $\bv_1 = \begin{bmatrix}
1 \\ 1 \\ 1
\end{bmatrix}$,
Expand $\bv_1$ into $\mathbb{R}^{3}$ basis $\beta = \{\bv_1,\bv_2,\bv_3\}$, and let it as a row vector of $Q$, we can get
$$
Q = \begin{bmatrix}
1 & 0 & 0 \\
1 & 1 & 0 \\
1 & 0 & 1
\end{bmatrix},
$$
so
$[f_{A}]_\beta^\beta = Q^{-1}AQ = \begin{bmatrix}
1 & 0 & 0 \\
-1 & 1 & 0 \\
-1 & 0 & 1 \\
\end{bmatrix}\begin{bmatrix}
1 & -1 & 0 \\
-1 & 2 & -1 \\
0 & -1 & 1 \\
\end{bmatrix}\begin{bmatrix}
1 & 0 & 0 \\
1 & 1 & 0 \\
1 & 0 & 1 \\
\end{bmatrix} = \begin{bmatrix}
0 & -1 & 0\\
0 & 3 & -1 \\
0 & 0 & 1
\end{bmatrix}.$
The first column shows that $\lambda_1 = 0$ is an eigenvalue of $A$.
The lower-right block is
$A_2 = \begin{bmatrix}
3 & -1 \\
0 & 1 \\
\end{bmatrix}$, so $\spec(A_2) = \{1,3\}$.
Therefore $\spec(A) =\{0\}\cup\spec(A_2) = \{0,1,3\}$.
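A quick Sage check:
```python
# check Exercise 2(c)
A = matrix(QQ, [[1, -1, 0], [-1, 2, -1], [0, -1, 1]])
Q = matrix(QQ, [[1, 0, 0], [1, 1, 0], [1, 0, 1]])
pretty_print(Q.inverse() * A * Q)   # upper triangular with diagonal 0, 3, 1
print(A.eigenvalues())              # 0, 1, 3
```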
---
##### Exercise 2(d)
$$
A = \begin{bmatrix}
0.2 & 0.8 & 0 \\
0.4 & 0.2 & 0.4 \\
0 & 0.8 & 0.2
\end{bmatrix}.
$$
<font color="f000">Ans:</font>
Let $\bv_1 = \begin{bmatrix}
1 \\ 1 \\ 1
\end{bmatrix}$,
Expand $\bv_1$ to the basis of $\mathbb{R}^{3}$ $\beta = \{\bv_1,\bv_2,\bv_3\}$ , and let it as a row vector of $Q$, we can get
$$Q = \begin{bmatrix}
1 & 0 & 0 \\
1 & 1 & 0 \\
1 & 0 & 1
\end{bmatrix} ,
$$
so $[f_{A}]_\beta^\beta = Q^{-1}AQ = \begin{bmatrix}
1 & 0 & 0 \\
-1 & 1 & 0 \\
-1 & 0 & 1 \\
\end{bmatrix}\begin{bmatrix}
0.2 & 0.8 & 0 \\
0.4 & 0.2 & 0.4 \\
0 & 0.8 & 0.2
\end{bmatrix}\begin{bmatrix}
1 & 0 & 0 \\
1 & 1 & 0 \\
1 & 0 & 1 \\
\end{bmatrix} = \begin{bmatrix}
1 & 0.8 & 0\\
0 & -0.6 & 0.4 \\
0 & 0 & 0.2
\end{bmatrix}.$
The first column shows that $\lambda_1 = 1$ is an eigenvalue of $A$.
The lower-right block is $A_2 = \begin{bmatrix}
-0.6 & 0.4 \\
0 & 0.2 \\
\end{bmatrix}$, so $\spec(A_2) = \{-0.6,0.2\}$.
Therefore $\spec(A) =\{1\}\cup\spec(A_2) = \{-0.6,0.2,1\}$.
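A quick Sage check; the entries are entered as exact rationals to avoid rounding:
```python
# check Exercise 2(d)
A = matrix(QQ, [[1/5, 4/5, 0], [2/5, 1/5, 2/5], [0, 4/5, 1/5]])
Q = matrix(QQ, [[1, 0, 0], [1, 1, 0], [1, 0, 1]])
pretty_print(Q.inverse() * A * Q)   # upper triangular with diagonal 1, -3/5, 1/5
print(A.eigenvalues())              # 1, -3/5, 1/5
```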
---
##### Exercise 3
令
$$
A = \begin{bmatrix}
0 & 1 & 1 & 1 \\
1 & 2 & 0 & 0 \\
1 & 0 & 2 & 0 \\
1 & 0 & 0 & 2 \\
\end{bmatrix}.
$$
已知 $\bone$ 為 $A$ 的一特徵向量。
求 $A$ 的所有特徵值。
提示:將 $A$ 對 $\bone$ 化簡後,再對 $A_2$ 化簡一次。
<!-- eng start -->
Let
$$
A = \begin{bmatrix}
0 & 1 & 1 & 1 \\
1 & 2 & 0 & 0 \\
1 & 0 & 2 & 0 \\
1 & 0 & 0 & 2 \\
\end{bmatrix}.
$$
It is known that $\bone$ is an eigenvector of $A$. Find all eigenvalues of $A$.
Hint: Apply the reduction lemma to $A$ and $\bone$ to get $A_2$. Then apply the lemma again to $A_2$.
<!-- eng end -->
<font color="f000">Ans:</font>
Let $\bv_1 = \begin{bmatrix}
1 \\ 1 \\ 1 \\ 1 \\
\end{bmatrix}$.
Extend $\bv_1$ to a basis $\beta = \{\bv_1,\bv_2,\bv_3,\bv_4\}$ of $\mathbb{R}^{4}$ by standard basis vectors, and let
$$
Q = \begin{bmatrix}
1 & 0 & 0 & 0\\
1 & 1 & 0 & 0\\
1 & 0 & 1 & 0\\
1 & 0 & 0 & 1
\end{bmatrix}
$$
Then $[f_A]_\beta^\beta = Q^{-1}AQ = \begin{bmatrix}
1 & 0 & 0 & 0\\
-1 & 1 & 0 & 0\\
-1 & 0 & 1 & 0\\
-1 & 0 & 0 & 1\\
\end{bmatrix}\begin{bmatrix}
0 & 1 & 1 & 1\\
1 & 2 & 0 & 0\\
1 & 0 & 2 & 0\\
1 & 0 & 0 & 2\\
\end{bmatrix}\begin{bmatrix}
1 & 0 & 0 & 0\\
1 & 1 & 0 & 0\\
1 & 0 & 1 & 0\\
1 & 0 & 0 & 1\\
\end{bmatrix} = \begin{bmatrix}
3 & 1 & 1 & 1\\
0 & 1 & -1 & -1\\
0 & -1 & 1 & -1\\
0 & -1 & -1 & 1\\
\end{bmatrix}.$
The first column shows that $\lambda_1 = 3$ is an eigenvalue of $A$.
Now repeat the reduction on the lower-right block
$A_2 = \begin{bmatrix}
1 & -1 & -1 \\
-1 & 1 & -1\\
-1 & -1 & 1 \\
\end{bmatrix}$.
The all-ones vector $\bw_1 = \begin{bmatrix}
1 \\ 1 \\ 1 \\
\end{bmatrix}$ is again an eigenvector of $A_2$.
Extend $\bw_1$ to a basis $\gamma = \{\bw_1,\bw_2,\bw_3\}$ of $\mathbb{R}^{3}$ by standard basis vectors, and let
$$
Q_2 = \begin{bmatrix}
1 & 0 & 0 \\
1 & 1 & 0 \\
1 & 0 & 1 \\
\end{bmatrix}
$$
Then $[f_{A_2}]_\gamma^\gamma = Q_2^{-1}A_2Q_2 = \begin{bmatrix}
1 & 0 & 0 \\
-1 & 1 & 0 \\
-1 & 0 & 1 \\
\end{bmatrix}\begin{bmatrix}
1 & -1 & -1 \\
-1 & 1 & -1 \\
-1 & -1 & 1 \\
\end{bmatrix}\begin{bmatrix}
1 & 0 & 0 \\
1 & 1 & 0 \\
1 & 0 & 1 \\
\end{bmatrix} = \begin{bmatrix}
-1 & -1 & -1\\
0 & 2 & 0 \\
0 & 0 & 2 \\
\end{bmatrix}.$
This gives a second eigenvalue $\lambda_2 = -1$.
The remaining block is $A_3 = \begin{bmatrix}
2 & 0 \\
0 & 2 \\
\end{bmatrix}$, so $\spec(A_3) = \{2,2\}$.
Therefore $\spec(A) =\{3\}\cup\{-1\}\cup\{2,2\} = \{-1,2,2,3\}$.
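A Sage check of the two-step reduction and the resulting spectrum:
```python
# check Exercise 3: reduce twice and read off the eigenvalues
A = matrix(QQ, [[0, 1, 1, 1], [1, 2, 0, 0], [1, 0, 2, 0], [1, 0, 0, 2]])
Q = matrix(QQ, [[1, 0, 0, 0], [1, 1, 0, 0], [1, 0, 1, 0], [1, 0, 0, 1]])
B = Q.inverse() * A * Q
A2 = B[1:, 1:]
Q2 = matrix(QQ, [[1, 0, 0], [1, 1, 0], [1, 0, 1]])
pretty_print(B)                       # first column (3,0,0,0)^T
pretty_print(Q2.inverse() * A2 * Q2)  # first column (-1,0,0)^T, lower-right block 2I
print(A.eigenvalues())                # -1, 2, 2, 3
```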
##### Exercise 4
令
$$
A = \begin{bmatrix}
0 & 1 & 0 \\
0 & 0 & 1 \\
1 & 0 & 0
\end{bmatrix}.
$$
<!-- eng start -->
Let
$$
A = \begin{bmatrix}
0 & 1 & 0 \\
0 & 0 & 1 \\
1 & 0 & 0
\end{bmatrix}.
$$
<!-- eng end -->
##### Exercise 4(a)
令 $\omega = e^{\frac{2\pi}{3}i}$ 且
$$
\bv = \begin{bmatrix} 1 \\ \omega \\ \omega^2 \end{bmatrix}.
$$
求出 $\bv$ 所對應的特徵值 $\lambda$,
並說明如何找到一個么正矩陣 $Q$ 使得
$$
Q^* AQ = \begin{bmatrix}
\lambda & * \\
\bzero & A_2
\end{bmatrix}.
$$
<!-- eng start -->
Let
$\omega = e^{\frac{2\pi}{3}i}$ and
$$
\bv = \begin{bmatrix} 1 \\ \omega \\ \omega^2 \end{bmatrix}.
$$
Find the eigenvalue $\lambda$ corresponding to $\bv$. Then explain how to find a unitary matrix $Q$ such that
$$
Q^* AQ = \begin{bmatrix}
\lambda & * \\
\bzero & A_2
\end{bmatrix}.
$$
<!-- eng end -->
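Before doing the reduction by hand, the stated eigenpair can be confirmed in Sage (a numerical sanity check only, not a solution of the exercise; `QQbar.zeta(3)` is used for $\omega$):
```python
# sanity check of the eigenpair in Exercise 4(a)
A = matrix(QQbar, [[0, 1, 0], [0, 0, 1], [1, 0, 0]])
w = QQbar.zeta(3)                  # omega = e^{2 pi i / 3} as an algebraic number
v = vector(QQbar, [1, w, w**2])
print(A * v == w * v)              # v is an eigenvector with eigenvalue omega
print(v.conjugate() * v)           # ||v||^2 = 3, so v / sqrt(3) can serve as the first column of a unitary Q
```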
##### Exercise 4(b)
令
$$
\bv = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}.
$$
求出 $\bv$ 所對應的特徵值 $\lambda$,
並說明如何找到一個實垂直矩陣 $Q$ 使得
$$
Q\trans AQ = \begin{bmatrix}
\lambda & * \\
\bzero & A_2
\end{bmatrix}.
$$
<!-- eng start -->
Let
$$
\bv = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}.
$$
Find the eigenvalue $\lambda$ corresponding to $\bv$. Then explain how to find a real orthogonal matrix $Q$ such that
$$
Q\trans AQ = \begin{bmatrix}
\lambda & * \\
\bzero & A_2
\end{bmatrix}.
$$
<!-- eng end -->
##### Exercise 4(c)
令 $\omega = e^{\frac{2\pi}{3}i}$ 且
$$
\bv = \begin{bmatrix} 1 \\ \omega \\ \omega^2 \end{bmatrix}.
$$
已知 $\bv$ 所對應的特徵值為 $\omega = a + bi$。
令 $\bv = \bx + \by i$,也就是 $\bx$ 和 $\by$ 分別為 $\bv$ 的實部和虛部向量。
驗證
$$
\begin{aligned}
A \bx &= a\bx - b\by, \\
A \by &= b\bx + a\by.
\end{aligned}
$$
並說明如何找到一個可逆矩陣 $Q$ 使得
$$
Q^{-1} AQ = \begin{bmatrix}
a & b & * \\
-b & a & * \\
0 & 0 & A_2
\end{bmatrix}.
$$
<!-- eng start -->
Let
$\omega = e^{\frac{2\pi}{3}i}$ and
$$
\bv = \begin{bmatrix} 1 \\ \omega \\ \omega^2 \end{bmatrix}.
$$
It is known that $\omega = a + bi$ is the eigenvalue corresponding to $\bv$. Let $\bv = \bx + \by i$ such that $\bx$ and $\by$ are both real vectors.
Verify that
$$
\begin{aligned}
A \bx &= a\bx - b\by, \\
A \by &= b\bx + a\by.
\end{aligned}
$$
Then explain how to find an invertible matrix $Q$ such that
$$
Q^{-1} AQ = \begin{bmatrix}
a & b & * \\
-b & a & * \\
0 & 0 & A_2
\end{bmatrix}.
$$
<!-- eng end -->
##### Exercise 5
令 $Q$ 為一 $n\times n$ 么正矩陣,而 $\bx,\by\in\mathbb{C}^n$。
證明 $\inp{\bx}{\by} = \inp{Q\bx}{Q\by}$。
(上述性質在當 $Q$ 是實垂直矩陣而 $\bx$ 和 $\by$ 為實向量時也對。)
這表示 $\bv\mapsto Q\bv$ 這個動作不會改變 $\bv$ 的長度,
因此么正矩陣和實垂直矩陣常被視為高維度的鏡射和旋轉。
(我們沒有說清楚高維度的鏡射和旋轉是什麼意思。)
<!-- eng start -->
Let $Q$ be an $n\times n$ unitary matrix and $\bx,\by\in\mathbb{C}^n$. Show that $\inp{\bx}{\by} = \inp{Q\bx}{Q\by}$. (The same statement also holds when $Q$ is a real orthogonal matrix and $\bx$ and $\by$ are real vectors.)
Therefore, the mapping $\bv\mapsto Q\bv$ preserves the length of any vector $\bv$, so unitary matrices and real orthogonal matrices are usually viewed as reflections or rotations in higher dimensions. (However, we did not clarify the meaning of reflections and rotations.)
<!-- eng end -->
<font color="f300">Ans:</font>
Since $Q$ is unitary, we have $Q^*Q=I$.
Therefore $\inp{Q\bx}{Q\by} = (Q\by)^*(Q\bx) = \by^* Q^* Q \bx = \by^*\bx = \inp{\bx}{\by}$.
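The identity can also be spot-checked on a concrete unitary matrix (an illustration only, with an arbitrarily chosen $Q$ and vectors; it does not replace the proof):
```python
# spot-check <x, y> = <Qx, Qy> with a concrete unitary Q
i = QQbar.zeta(4)                       # the imaginary unit as an algebraic number
Q = matrix(QQbar, [[1, 1], [i, -i]]) / QQbar(2).sqrt()
x = vector(QQbar, [1, 2 + i])
y = vector(QQbar, [3*i, -1])

def inp(a, b):                          # <a, b> = b^* a
    return b.conjugate() * a

print(Q.conjugate_transpose() * Q == identity_matrix(QQbar, 2))   # Q is unitary
print(inp(x, y) == inp(Q * x, Q * y))                             # inner product is preserved
```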
---
##### Exercise 6
固定一個正整數 $n$。
令 $\zeta = e^{\frac{2\pi}{n}i}$,
並令 $Q$ 為一 $n\times n$ 矩陣,
其第 $a,b$-項為 $\frac{1}{\sqrt{n}}\zeta^{(a-1)(b-1)}$。
證明 $Q$ 是一個么正矩陣。
(這個矩陣稱為**離散傅立葉變換矩陣** 。)
<!-- eng start -->
Fix a positive integer $n$. Let $\zeta = e^{\frac{2\pi}{n}i}$ and $Q$ the $n\times n$ matrix whose $a,b$-entry is $\frac{1}{\sqrt{n}}\zeta^{(a-1)(b-1)}$.
Show that $Q$ is a unitary matrix. (The matrix $Q$ is known as the **discrete Fourier transform matrix** .)
<!-- eng end -->
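The claim can be sanity-checked in Sage for small $n$ (a numerical check with the $\frac{1}{\sqrt{n}}$ normalization, not the requested proof; note that the code indexes entries from $0$, so `zeta**(a*b)` matches $\zeta^{(a-1)(b-1)}$ in the statement):
```python
# sanity check of Exercise 6 for small n: the normalized DFT matrix is unitary
for n in [2, 3, 4, 5]:
    zeta = QQbar.zeta(n)                                      # zeta = e^{2 pi i / n}
    Q = matrix(QQbar, n, n, lambda a, b: zeta**(a*b)) / QQbar(n).sqrt()
    print(n, Q.conjugate_transpose() * Q == identity_matrix(QQbar, n))
```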
:::info
collaboration: 2
3 problems: 3
- 2ab, 3
extra: 1.5
- 2cd, 5
moderator: 1
qc: 1
:::