Orthogonalization of a basis
This work by Jephian Lin is licensed under a Creative Commons Attribution 4.0 International License.
\(\newcommand{\trans}{^\top} \newcommand{\adj}{^{\rm adj}} \newcommand{\cof}{^{\rm cof}} \newcommand{\inp}[2]{\left\langle#1,#2\right\rangle} \newcommand{\dunion}{\mathbin{\dot\cup}} \newcommand{\bzero}{\mathbf{0}} \newcommand{\bone}{\mathbf{1}} \newcommand{\ba}{\mathbf{a}} \newcommand{\bb}{\mathbf{b}} \newcommand{\bc}{\mathbf{c}} \newcommand{\bd}{\mathbf{d}} \newcommand{\be}{\mathbf{e}} \newcommand{\bh}{\mathbf{h}} \newcommand{\bp}{\mathbf{p}} \newcommand{\bq}{\mathbf{q}} \newcommand{\br}{\mathbf{r}} \newcommand{\bx}{\mathbf{x}} \newcommand{\by}{\mathbf{y}} \newcommand{\bz}{\mathbf{z}} \newcommand{\bu}{\mathbf{u}} \newcommand{\bv}{\mathbf{v}} \newcommand{\bw}{\mathbf{w}} \newcommand{\tr}{\operatorname{tr}} \newcommand{\nul}{\operatorname{null}} \newcommand{\rank}{\operatorname{rank}} %\newcommand{\ker}{\operatorname{ker}} \newcommand{\range}{\operatorname{range}} \newcommand{\Col}{\operatorname{Col}} \newcommand{\Row}{\operatorname{Row}} \newcommand{\spec}{\operatorname{spec}} \newcommand{\vspan}{\operatorname{span}} \newcommand{\Vol}{\operatorname{Vol}} \newcommand{\sgn}{\operatorname{sgn}} \newcommand{\idmap}{\operatorname{id}} \newcommand{\am}{\operatorname{am}} \newcommand{\gm}{\operatorname{gm}} \newcommand{\mult}{\operatorname{mult}} \newcommand{\iner}{\operatorname{iner}}\)
Main idea
Suppose \(U\) is a finite-dimensional inner product space.
Then every subspace of \(U\) has an orthonormal basis.
Indeed, there is an algorithm for converting a basis into an orthonormal basis.
This process is referred to as orthogonalization.
Let \(V\) be a subspace of \(U\) and \(S = \{{\bf u}_1, \ldots, {\bf u}_d\}\) a basis of \(V\).
Taking \(S\) as the input, the following steps generate an orthogonal basis \(\hat{S} = \{\hat{\bf u}_1, \ldots, \hat{\bf u}_d\}\) such that \(V = \operatorname{span}(S) = \operatorname{span}(\hat{S})\).
First let \(\hat{\bf u}_1 = {\bf u}_1\).
For \(k \geq 2\), let \(V_{k-1} = \operatorname{span}(\{\hat{\bf u}_1, \ldots, \hat{\bf u}_{k-1}\})\).
Then \(\hat{\bf u}_{k} = {\bf u}_{k} - \operatorname{proj}_{V_{k-1}}({\bf u}_{k})\), where \(\operatorname{proj}_{V_{k-1}}({\bf u}_k)\) is the projection of \({\bf u}_k\) onto the subspace \(V_{k-1}\).
Repeat this step for \(k = 2,\ldots, d\).
Normalizing each vector in \(\hat{S}\) then gives an orthonormal basis of \(V\).
This algorithm is called the Gram–Schmidt process.
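As a concrete illustration, here is a minimal Sage sketch of this process; the helper `proj`, the function name `gram_schmidt`, and the sample vectors are chosen only for illustration.

```python
# Sage sketch: Gram–Schmidt process on a list of linearly independent vectors.
def proj(u, v):
    # projection of u onto the line spanned by v
    return (u.inner_product(v) / v.inner_product(v)) * v

def gram_schmidt(vectors):
    # returns an orthogonal list spanning the same subspace
    ortho = []
    for u in vectors:
        u_hat = u - sum((proj(u, v) for v in ortho), 0 * u)
        ortho.append(u_hat)
    return ortho

S = [vector(QQ, [1, 1, 0]), vector(QQ, [1, 0, 1]), vector(QQ, [0, 1, 1])]
S_hat = gram_schmidt(S)
print(S_hat)
print(all(S_hat[i].inner_product(S_hat[j]) == 0
          for i in range(3) for j in range(3) if i != j))  # True
```

Dividing each vector in the returned list by its norm would give an orthonormal list.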
Let \(\operatorname{proj}_{\bf v}({\bf u})\) be the projection of \({\bf u}\) onto the line spanned by \({\bf v}\).
Notice that
\[\begin{aligned} {\bf u}_k &= \hat{\bf u}_k + \operatorname{proj}_{V_{k-1}}({\bf u}_k) \\ &= \operatorname{proj}_{\hat{\bf u}_1}({\bf u}_k) + \cdots + \operatorname{proj}_{\hat{\bf u}_{k-1}}({\bf u}_k) + \hat{\bf u}_k \end{aligned} \] since \(\{\hat{\bf u}_1,\ldots,\hat{\bf u}_{k-1}\}\) is orthogonal.
Therefore, \({\bf u}_k \in \operatorname{span}(\{\hat{\bf u}_1, \ldots, \hat{\bf u}_k\})\) for each \(k\).
Let \(Q\) be the matrix whose columns are vectors in \(\hat{S}\)
and \(A\) the matrix whose columns are vectors in \(S\).
Then \(A = QR\) for some upper triangular matrix \(R\).
This decomposition is called the \(QR\) decomposition of \(A\).
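For instance, if the vectors in \(\hat{S}\) are normalized so that the columns of \(Q\) are orthonormal, the entries of \(R\) can be read off directly (a sketch using the standard inner product):
\[ R = Q\trans A, \qquad [R]_{ik} = \inp{\hat{\bf u}_i}{{\bf u}_k}, \]
and \([R]_{ik} = 0\) whenever \(i > k\) since \({\bf u}_k \in \operatorname{span}(\{\hat{\bf u}_1, \ldots, \hat{\bf u}_k\})\), so \(R\) is upper triangular.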
Let \(Q\) be an \(m\times n\) matrix.
Then the following are equivalent:
1. The columns of \(Q\) form an orthonormal set in \(\mathbb{R}^m\).
2. \(Q\trans Q = I_n\).
3. \(\inp{Q\bx}{Q\by} = \inp{\bx}{\by}\) for all \(\bx, \by \in \mathbb{R}^n\).
A matrix \(Q\) satisfying one of these conditions is called a column-orthogonal matrix.
Similarly, \(Q\) is row-orthogonal if the rows of \(Q\) form an orthonormal set in \(\mathbb{R}^n\) (equivalently, \(QQ\trans = I_m\)).
When \(Q\) is an \(n\times n\) matrix, \(Q\) is column-orthogonal if and only if it is row-orthogonal.
Such a matrix is called an orthogonal matrix.
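For a quick check in Sage, here is an illustrative \(3\times 2\) matrix with orthonormal columns (the matrix is made up for this example):

```python
# Sage: a 3x2 column-orthogonal matrix satisfies Q^T Q = I_2,
# but it is not row-orthogonal since it is not square.
Q = matrix(QQ, [
    [1/3,  2/3],
    [2/3,  1/3],
    [2/3, -2/3],
])
print(Q.transpose() * Q == identity_matrix(QQ, 2))  # True
print(Q * Q.transpose() == identity_matrix(QQ, 3))  # False
```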
Side stories
A.QR()
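For matrices over `RDF` (or `CDF`), Sage's `A.QR()` should return the pair \((Q, R)\); a small sketch with an arbitrary matrix:

```python
# Sage: numerical QR decomposition over RDF.
A = matrix(RDF, [
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
])
Q, R = A.QR()
print(Q)
print(R)
print((Q * R - A).norm() < 1e-12)  # True up to floating-point error
```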
Experiments
Exercise 1
Run the code below.
Let \(S = \{{\bf u}_1,{\bf u}_2,{\bf u}_3\}\) be the columns of \(A\).
Use the formulas above to carry out the following computations.
Exercise 1(a)
Compute \(\hat{S} = \{ \hat{\bf u}_1, \hat{\bf u}_2, \hat{\bf u}_3\}\).
You may use
projection_on_vector(u, v)
to compute \(\operatorname{proj}_{\bf v}({\bf u})\).
Exercise 1(b)
Determine whether \(\hat{S}\) is orthogonal and whether it is orthonormal.
If it is not orthonormal, find an orthonormal basis.
Exercise 1(c)
Find an orthogonal matrix \(Q\) and an upper triangular matrix \(R\) such that \(A = QR\).
Exercises
Exercise 2
Find an orthogonal basis for each of the following spaces.
(You do not have to normalize the vectors to unit length.)
Exercise 2(a)
Find an orthogonal basis of \(V = \{(x,y,z) : x + y + z = 0\}\).
Exercise 2(b)
Find an orthogonal basis of \(V = \{(x,y,z,w) : x + y + z + w = 0\}\).
Exercise 3
Let \(S = \{{\bf u}_1,{\bf u}_2,{\bf u}_3\}\) be a basis of the vector space \(V\), and let \(A\) be the matrix whose columns are the vectors in \(S\).
Let \(\hat{S} = \{ \hat{\bf u}_1, \hat{\bf u}_2, \hat{\bf u}_3\}\) be an orthonormal basis of \(V\), and let \(Q\) be the matrix whose columns are the vectors in \(\hat{S}\).
Suppose that
\[\begin{aligned} {\bf u}_1 &= \hat{\bf u}_1, \\ {\bf u}_2 &= 2\hat{\bf u}_1 + 3\hat{\bf u}_2, \\ {\bf u}_3 &= 4\hat{\bf u}_1 + 5\hat{\bf u}_2 + 6\hat{\bf u}_3.\\ \end{aligned} \] Find an upper triangular matrix \(R\) such that \(A = QR\).
Exercise 4
Let \(A = QR\) be the \(QR\) decomposition of \(A\).
Exercise 4(a)
Explain why \(\operatorname{Col}(A) = \operatorname{Col}(Q)\).
Exercise 4(b)
Hence the projections onto \(\operatorname{Col}(A)\) and onto \(\operatorname{Col}(Q)\) should be the same.
Explain why \(A(A^\top A)^{-1}A^\top = Q(Q^\top Q)^{-1}Q^\top\).
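As a quick sanity check of this identity in Sage (a sketch with a made-up \(A\), and a \(Q\) whose columns are simply an orthogonal basis of \(\operatorname{Col}(A)\); the identity only uses \(\operatorname{Col}(A) = \operatorname{Col}(Q)\)):

```python
# Sage: both projection matrices project onto the same column space.
A = matrix(QQ, [[1, 1], [1, 0], [0, 1]])
Q = matrix(QQ, [[1, 1], [1, -1], [0, 2]])  # columns: an orthogonal basis of Col(A)
P_A = A * (A.transpose() * A).inverse() * A.transpose()
P_Q = Q * (Q.transpose() * Q).inverse() * Q.transpose()
print(P_A == P_Q)  # True
```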
Exercise 5
Write a program that, given a matrix \(A\) as input, outputs its \(QR\) decomposition.
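One possible skeleton for such a program, built on the Gram–Schmidt process (a sketch that assumes the columns of \(A\) are linearly independent and uses the standard inner product; the function name `my_QR` is only for illustration):

```python
# Sage sketch: QR decomposition through the Gram–Schmidt process.
def my_QR(A):
    # Q has orthonormal columns, R is (numerically) upper triangular, and A == Q * R.
    ortho = []
    for u in A.columns():
        u_hat = u - sum((u.inner_product(v) * v for v in ortho), 0 * u)
        ortho.append(u_hat / u_hat.norm())
    Q = matrix(ortho).transpose()
    R = Q.transpose() * A
    return Q, R

A = matrix(RDF, [[1, 1, 0], [1, 0, 1], [0, 1, 1]])
Q, R = my_QR(A)
print((Q * R - A).norm() < 1e-12)  # True up to floating-point error
```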