# $\mathbb{R}^n$ 中的向量表示法
Vector representation in $\mathbb{R}^n$

This work by Jephian Lin is licensed under a [Creative Commons Attribution 4.0 International License](http://creativecommons.org/licenses/by/4.0/).
$\newcommand{\trans}{^\top}
\newcommand{\adj}{^{\rm adj}}
\newcommand{\cof}{^{\rm cof}}
\newcommand{\inp}[2]{\left\langle#1,#2\right\rangle}
\newcommand{\dunion}{\mathbin{\dot\cup}}
\newcommand{\bzero}{\mathbf{0}}
\newcommand{\bone}{\mathbf{1}}
\newcommand{\ba}{\mathbf{a}}
\newcommand{\bb}{\mathbf{b}}
\newcommand{\bc}{\mathbf{c}}
\newcommand{\bd}{\mathbf{d}}
\newcommand{\be}{\mathbf{e}}
\newcommand{\bh}{\mathbf{h}}
\newcommand{\bp}{\mathbf{p}}
\newcommand{\bq}{\mathbf{q}}
\newcommand{\br}{\mathbf{r}}
\newcommand{\bx}{\mathbf{x}}
\newcommand{\by}{\mathbf{y}}
\newcommand{\bz}{\mathbf{z}}
\newcommand{\bu}{\mathbf{u}}
\newcommand{\bv}{\mathbf{v}}
\newcommand{\bw}{\mathbf{w}}
\newcommand{\tr}{\operatorname{tr}}
\newcommand{\nul}{\operatorname{null}}
\newcommand{\rank}{\operatorname{rank}}
%\newcommand{\ker}{\operatorname{ker}}
\newcommand{\range}{\operatorname{range}}
\newcommand{\Col}{\operatorname{Col}}
\newcommand{\Row}{\operatorname{Row}}
\newcommand{\spec}{\operatorname{spec}}
\newcommand{\vspan}{\operatorname{span}}
\newcommand{\Vol}{\operatorname{Vol}}
\newcommand{\sgn}{\operatorname{sgn}}
\newcommand{\idmap}{\operatorname{id}}
\newcommand{\am}{\operatorname{am}}
\newcommand{\gm}{\operatorname{gm}}
\newcommand{\mult}{\operatorname{mult}}
\newcommand{\iner}{\operatorname{iner}}$
```python
from lingeo import random_int_list, random_good_matrix
```
## Main idea
Recall that $\mathcal{E}_n = \{ \be_1, \ldots, \be_n \}$ is the standard basis of $\mathbb{R}^n$.
Any vector $\bv = (c_1, \ldots, c_n)\in\mathbb{R}^n$ can be written as
$$
\bv = c_1\be_1 + \cdots + c_n\be_n.
$$
Similarly, let $\beta = \{ \bu_1, \ldots, \bu_n \}$ be a basis of $\mathbb{R}^n$.
Then every vector $\bv\in\mathbb{R}^n$ can be written uniquely as a linear combination
$$
\bv = c_1\bu_1 + \cdots + c_n\bu_n.
$$
We call the vector $(c_1,\ldots, c_n)\in\mathbb{R}^n$ the **vector representation** of $\bv$ with respect to the basis $\beta$, denoted as $[\bv]_\beta$.
Since every vector in $\mathbb{R}^n$ can be written as a linear combination of the vectors in $\beta$ in exactly one way, the correspondence between $\bv$ and $[\bv]_\beta$ is one-to-one.
Now let $\beta = \{ \bu_1, \ldots, \bu_n \}$ be a basis of $\mathbb{R}^n$ and let
$A$ be the $n\times n$ matrix whose columns are the vectors of $\beta$.
Since $\beta$ is a basis, $A$ is invertible.
By definition,
$$
A[\bv]_\beta = \bv \text{ and } A^{-1}\bv = [\bv]_\beta.
$$
When $\beta$ is the standard basis of $\mathbb{R}^n$, $A = I_n$ and $[\bv]_\beta = \bv$.
Therefore, our usual way of writing a vector is the vector representation with respect to the standard basis.
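For a non-standard basis, the two formulas above can be checked directly.  The following is a minimal sketch, assuming a Sage session like the one in the Experiments below; the matrix here is made up for illustration.
```python
# Sketch: computing [v]_beta for a made-up basis beta of R^3.
# The columns of A are assumed to form a basis, so A is invertible.
A = matrix([[1, 1, 0],
            [0, 1, 1],
            [0, 0, 1]])
v = vector([3, 2, 1])
v_beta = A.inverse() * v   # [v]_beta = A^{-1} v
print(v_beta)              # (2, 1, 1)
print(A * v_beta == v)     # True: A [v]_beta recovers v
```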
In the case when $\beta$ is an orthonormal basis, $A$ is an orthogonal matrix and $A^{-1} = A\trans$.
Therefore,
$$
A[\bv]_\beta = \bv \text{ and } A\trans\bv = [\bv]_\beta.
$$
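A small check of the orthonormal case, again a sketch assuming a Sage session; the $45^\circ$ rotation matrix below is made up for illustration.
```python
# Sketch: for an orthonormal basis, [v]_beta = A^T v, so no inversion is needed.
A = matrix([[1/sqrt(2), -1/sqrt(2)],
            [1/sqrt(2),  1/sqrt(2)]])   # columns form an orthonormal basis of R^2
v = vector([1, 0])
print(A.transpose() * v)                 # (1/sqrt(2), -1/sqrt(2))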
## Side stories
- vector representation algebra
- define new inner product
## Experiments
##### Exercise 1
執行以下程式碼。
令 $\beta = \{ \bu_1, \ldots, \bu_3 \}$ 為 $A$ 的行向量且
已知其為 $\mathbb{R}^3$ 的基底。
<!-- eng start -->
Run the code below. Let $\beta = \{ \bu_1, \ldots, \bu_3 \}$ be the columns of $A$. Suppose $\beta$ is a basis of $\mathbb{R}^3$.
<!-- eng end -->
```python
### code
set_random_seed(0)
print_ans = False
m,n,r = 3,3,3
A = random_good_matrix(m,n,r, bound=3)
x1 = vector(random_int_list(n, 3))
x2 = vector(random_int_list(n, 3))
v1,v2 = A*x1, A*x2
k = choice([3,4,5])
print("A =")
show(A)
print("v1 =", v1)
print("v2 =", v2)
print("k =", k)
if print_ans:
    Ainv = A.inverse()
    print("[v1]_beta =", Ainv * v1)
    print("[v2]_beta =", Ainv * v2)
    print("[v1 + v2]_beta =", Ainv * (v1 + v2))
    print("[v1]_beta + [v2]_beta =", Ainv * v1 + Ainv * v2)
    print("[k * v1]_beta =", Ainv * (k*v1))
    print("k * [v1]_beta =", k * Ainv * v1)
    print("< [v1]_beta, [v2]_beta > =", (Ainv * v1).inner_product(Ainv * v2))
    print("< v1, v2 > =", (v1).inner_product(v2))
```
<font color=blue>
The results from the code are
$$
A = \begin{bmatrix}
1 & 3 & 1 \\
-3 & -8 & -1 \\
0 & -1 & -1
\end{bmatrix}
$$
${\bf v}_1 = (4, -15, 1)$\
${\bf v}_2 = (8,-20, -3)$\
$k=3$
</font>
##### Exercise 1(a)
求出 $[\bv_1]_\beta$ 及 $[\bv_2]_\beta$。
<!-- eng start -->
Find $[\bv_1]_\beta$ and $[\bv_2]_\beta$.
<!-- eng end -->
<font color=blue>Answer 1(a):\
First of all, we need to find the inverse of the matrix
$$
A = \begin{bmatrix}
1 & 3 & 1 \\
-3 & -8 & -1 \\
0 & -1 & -1
\end{bmatrix}
$$
Since we know that $AA^{-1} = I$,\
we let
$$ \left[\begin{array}{c|c} A & I \end{array}\right] = \left[\begin{array}{ccc|ccc}
1 & 3 & 1 & 1 & 0 & 0 \\
-3 & -8 & -1 & 0 & 1 & 0 \\
0 & -1 & -1 & 0 & 0 & 1
\end{array}\right]
$$
and then we use row operations to reduce $A$ to $I$
$$ \left[\begin{array}{c|c} I & B \end{array}\right] = \left[\begin{array}{ccc|ccc}
1 & 0 & 0 & 7 & 2 & 5 \\
0 & 1 & 0 & -3 & -1 & -2 \\
0 & 0 & 1 & 3 & 1 & 1
\end{array}\right]
$$
Since the row operations that reduce $A$ to $I$ turn $I$ into $B$, we have $BA=I$ and hence $B=A^{-1}$.\
We can then get $[\bv]_\beta$ by computing $[\bv]_\beta = A^{-1}\bv$:
$$[{\bf v}_1]_\beta=A^{-1}{\bf v}_1=\begin{bmatrix}
7 & 2 & 5 \\
-3 & -1 & -2 \\
3 & 1 & 1
\end{bmatrix}
\begin{bmatrix}
4 \\
-15 \\
1\end{bmatrix}=
\begin{bmatrix}
3 \\
1 \\
-2
\end{bmatrix},
$$
$$[{\bf v}_2]_\beta=A^{-1}{\bf v}_2=\begin{bmatrix}
7 & 2 & 5 \\
-3 & -1 & -2 \\
3 & 1 & 1
\end{bmatrix}
\begin{bmatrix}
8 \\
-20 \\
-3\end{bmatrix}=
\begin{bmatrix}
1 \\
2 \\
1
\end{bmatrix}.
$$
</font>
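These values can be double-checked in Sage (a quick sketch, assuming the same session as the experiment code above):
```python
# Verify the hand computation of [v1]_beta and [v2]_beta.
A = matrix([[1, 3, 1], [-3, -8, -1], [0, -1, -1]])
Ainv = A.inverse()
print(Ainv)                         # the matrix B found by row reduction
print(Ainv * vector([4, -15, 1]))   # (3, 1, -2)
print(Ainv * vector([8, -20, -3]))  # (1, 2, 1)
```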
##### Exercise 1(b)
判斷是否 $[\bv_1 + \bv_2]_\beta = [\bv_1]_\beta + [\bv_2]_\beta$。
<!-- eng start -->
Check if $[\bv_1 + \bv_2]_\beta = [\bv_1]_\beta + [\bv_2]_\beta$.
<!-- eng end -->
<font color=blue>
Answer 1(b):\
${\bf v}_1 = (4,-15,1)$\
${\bf v}_2 = (8,-20,-3)$\
${\bf v}_1+{\bf v}_2 =(12, -35, -2)$
$$[{\bf v}_1+{\bf v}_2]_\beta=A^{-1}({\bf v}_1+{\bf v}_2)=\begin{bmatrix}
7 & 2 & 5 \\
-3 & -1 & -2 \\
3 & 1 & 1
\end{bmatrix}
\begin{bmatrix}
12 \\
-35 \\
-2\end{bmatrix}=
\begin{bmatrix}
4 \\
3 \\
-1
\end{bmatrix}$$
$$[{\bf v}_1]_\beta + [{\bf v}_2]_\beta=
\begin{bmatrix}
3 \\
1 \\
-2
\end{bmatrix}+\begin{bmatrix}
1 \\
2 \\
1
\end{bmatrix}=\begin{bmatrix}
4 \\
3 \\
-1
\end{bmatrix}
$$
We can confirm that $[\bv_1 + \bv_2]_\beta = [\bv_1]_\beta + [\bv_2]_\beta$.
</font>
##### Exercise 1\(c\)
判斷是否 $[k\bv_1]_\beta = k[\bv_1]_\beta$。
<!-- eng start -->
Check if $[k\bv_1]_\beta = k[\bv_1]_\beta$.
<!-- eng end -->
<font color=blue>
Answer 1(c):\
As computed above,
$$
k=3, \quad {\bf v}_1 = (4,-15,1), \quad
[{\bf v}_1]_\beta=
\begin{bmatrix}
3 \\1 \\-2
\end{bmatrix}.
$$
Calculate the left side of the equation.
First, we get
$$
k{\bf v}_1 = 3(4,-15,1)=(12,-45,3).
$$
Then, we have
$$
[k{\bf v}_1]_\beta=
A^{-1}(k{\bf v}_1)=
\begin{bmatrix}
7 & 2 & 5 \\
-3 & -1 & -2 \\
3 & 1 & 1
\end{bmatrix}
\begin{bmatrix}
12 \\ -45 \\ 3
\end{bmatrix}
=\begin{bmatrix}
9 \\ 3 \\ -6
\end{bmatrix}.
$$
Calculate the right side of the equation:
$$
k[{\bf v}_1]_\beta=
3\begin{bmatrix}
3 \\1 \\-2
\end{bmatrix}
=\begin{bmatrix}
9 \\3 \\-6
\end{bmatrix}.
$$
Since both sides give the same result, we conclude that
$[k\bv_1]_\beta = k[\bv_1]_\beta$ holds.
</font>
##### Exercise 1(d)
判斷是否 $\inp{\bv_1}{\bv_2} = \inp{[\bv_1]_\beta}{[\bv_2]_\beta}$。
<!-- eng start -->
Check if $\inp{\bv_1}{\bv_2} = \inp{[\bv_1]_\beta}{[\bv_2]_\beta}$.
<!-- eng end -->
<font color=blue>
Answer 1(d):\
$$
\inp{\bv_1}{\bv_2} = 4\cdot 8+(-15)(-20)+1\cdot(-3) = 32+300-3=329
$$
$$
\inp{[\bv_1]_\beta}{[\bv_2]_\beta}= 3\cdot 1+1\cdot 2+(-2)\cdot 1 = 3+2-2=3
$$
Since $329 \neq 3$, we have
$$
\inp{\bv_1}{\bv_2} \neq \inp{[\bv_1]_\beta}{[\bv_2]_\beta}.
$$
</font>
## Exercises
##### Exercise 2
已知
$$
A = \begin{bmatrix}
1 & 0 & 1 \\
-1 & 1 & -4 \\
5 & -2 & 12
\end{bmatrix}
$$
的反矩陣為
$$
A^{-1} = \begin{bmatrix}
4 & -2 & -1 \\
-8 & 7 & 3 \\
-3 & 2 & 1
\end{bmatrix}.
$$
令 $\beta$ 為 $A$ 的行向量集合。
令 $\bv_1 = (1,1,1)$、
$\bv_2 = (1,2,3)$。
求 $[\bv_1]_\beta$ 和 $[\bv_2]_\beta$。
<!-- eng start -->
It is known that the inverse of
$$
A = \begin{bmatrix}
1 & 0 & 1 \\
-1 & 1 & -4 \\
5 & -2 & 12
\end{bmatrix}
$$
is
$$
A^{-1} = \begin{bmatrix}
4 & -2 & -1 \\
-8 & 7 & 3 \\
-3 & 2 & 1
\end{bmatrix}.
$$
Let $\beta$ be the columns of $A$.
Let $\bv_1 = (1,1,1)$ and $\bv_2 = (1,2,3)$.
Find $[\bv_1]_\beta$ and $[\bv_2]_\beta$.
<!-- eng end -->
<font color=blue>
ANS:
$$[{\bf v}_1]_\beta =A^{-1}{\bf v}_1= \begin{bmatrix}
4 & -2 & -1 \\
-8 & 7 & 3 \\
-3 & 2 & 1
\end{bmatrix}
\begin{bmatrix}
1 \\1\\1\end{bmatrix}=
\begin{bmatrix}
1\\2\\0\end{bmatrix},
$$
$$[{\bf v}_2]_\beta =A^{-1}{\bf v}_2= \begin{bmatrix}
4 & -2 & -1 \\
-8 & 7 & 3 \\
-3 & 2 & 1
\end{bmatrix}
\begin{bmatrix}
1 \\2\\3\end{bmatrix}=
\begin{bmatrix}
-3\\15\\4\end{bmatrix}.
$$
</font>
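These computations can be double-checked in Sage (a sketch; the matrices are the ones given in the problem):
```python
# Verify the given inverse and the two representations.
A = matrix([[1, 0, 1], [-1, 1, -4], [5, -2, 12]])
Ainv = matrix([[4, -2, -1], [-8, 7, 3], [-3, 2, 1]])
print(A * Ainv == identity_matrix(3))   # True
print(Ainv * vector([1, 1, 1]))         # (1, 2, 0)
print(Ainv * vector([1, 2, 3]))         # (-3, 15, 4)
```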
##### Exercise 3
已知
$$
A = \begin{bmatrix}
\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{6}} \\
\frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{6}} \\
\frac{1}{\sqrt{3}} & 0 & -\frac{2}{\sqrt{6}}
\end{bmatrix}
$$
為一垂直矩陣。
令 $\beta$ 為 $A$ 的行向量集合。
令 $\bv_1 = (1,1,1)$、
$\bv_2 = (1,2,3)$。
求 $[\bv_1]_\beta$ 和 $[\bv_2]_\beta$。
<!-- eng start -->
It is known that
$$
A = \begin{bmatrix}
\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{6}} \\
\frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{6}} \\
\frac{1}{\sqrt{3}} & 0 & -\frac{2}{\sqrt{6}}
\end{bmatrix}
$$
is an orthogonal matrix. Let $\beta$ be the columns of $A$. Let $\bv_1 = (1,1,1)$ and $\bv_2 = (1,2,3)$. Find $[\bv_1]_\beta$ and $[\bv_2]_\beta$.
<!-- eng end -->
<font color=blue>
Answer 3:
$$
A^{-1}=A\trans=\begin{bmatrix}
\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \\
\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0 \\
\frac{1}{\sqrt{6}} &\frac{1}{\sqrt{6}} & -\frac{2}{\sqrt{6}}
\end{bmatrix}
$$
$$
[{\bf v}_1]_\beta =A\trans{\bf v}_1= \begin{bmatrix}
\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \\
\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0 \\
\frac{1}{\sqrt{6}} &\frac{1}{\sqrt{6}} & -\frac{2}{\sqrt{6}}
\end{bmatrix}
\begin{bmatrix}
1 \\1\\1\end{bmatrix}=
\begin{bmatrix}
{\sqrt{3}}\\0\\0\end{bmatrix}
$$
$$
[{\bf v}_2]_\beta =A\trans{\bf v}_2= \begin{bmatrix}
\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \\
\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0 \\
\frac{1}{\sqrt{6}} &\frac{1}{\sqrt{6}} & -\frac{2}{\sqrt{6}}
\end{bmatrix}
\begin{bmatrix}
1 \\2\\3\end{bmatrix}=
\begin{bmatrix}
{2\sqrt{3}}\\\frac{-\sqrt{2}}{2}\\\frac{-\sqrt{6}}{2}\end{bmatrix}
$$
</font>
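A symbolic check in Sage (a sketch, assuming a Sage session; `simplify_full` is used to clean up the square roots):
```python
# Verify that A is orthogonal and compute the representations via A^T.
A = matrix([
    [1/sqrt(3),  1/sqrt(2),  1/sqrt(6)],
    [1/sqrt(3), -1/sqrt(2),  1/sqrt(6)],
    [1/sqrt(3),  0,         -2/sqrt(6)],
])
print((A.transpose() * A).simplify_full())  # the 3x3 identity matrix
print(A.transpose() * vector([1, 1, 1]))    # (3/sqrt(3), 0, 0) = (sqrt(3), 0, 0)
print(A.transpose() * vector([1, 2, 3]))    # simplifies to (2*sqrt(3), -sqrt(2)/2, -sqrt(6)/2)
```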
##### Exercise 4
令 $\beta$ 為 $\mathbb{R}^n$ 中的一組基底。
定義
$$
\begin{aligned}
f : \mathbb{R}^n &\rightarrow \mathbb{R}^n \\
\bv &\mapsto [\bv]_\beta \\
\end{aligned}
$$
為一函數。
<!-- eng start -->
Let $\beta$ be a basis of $\mathbb{R}^n$. Define the function
$$
\begin{aligned}
f : \mathbb{R}^n &\rightarrow \mathbb{R}^n \\
\bv &\mapsto [\bv]_\beta \\
\end{aligned}
$$
<!-- eng end -->
##### Exercise 4(a)
驗證 $f$ 為一線性函數。
<!-- eng start -->
Verify that $f$ is linear.
<!-- eng end -->
:::info
By definition, $f(\bv)=[\bv]_\beta$.
Write $\beta = \{\bu_1,\ldots,\bu_n\}$ and suppose $[\bv_1]_\beta = (c_1,\ldots,c_n)$ and $[\bv_2]_\beta = (d_1,\ldots,d_n)$; that is,
$$\bv_1 = c_1\bu_1 + \cdots + c_n\bu_n \text{ and } \bv_2 = d_1\bu_1 + \cdots + d_n\bu_n.$$
(i) Adding the two expansions gives
$$\bv_1 + \bv_2 = (c_1+d_1)\bu_1 + \cdots + (c_n+d_n)\bu_n,$$
and since the expansion with respect to $\beta$ is unique,
$$f(\bv_1 + \bv_2)=[\bv_1 + \bv_2]_\beta = [\bv_1]_\beta + [\bv_2]_\beta=f(\bv_1)+f(\bv_2).$$
(ii) Similarly, $k\bv_1 = (kc_1)\bu_1 + \cdots + (kc_n)\bu_n$, so by uniqueness
$$f(k\bv_1)=[k\bv_1]_\beta = k[\bv_1]_\beta=kf(\bv_1).$$
By (i) and (ii), $f$ is linear.
:::
##### Exercise 4(b)
判斷 $f$ 是否是嵌射。
<!-- eng start -->
Is $f$ injective?
<!-- eng end -->
##### Exercise 4(c)
判斷 $f$ 是否是映射。
<!-- eng start -->
Is $f$ surjective?
<!-- eng end -->
##### Exercise 4(d)
求出 $f$ 的矩陣表示法 $[f]$。
<!-- eng start -->
Find the matrix representation $[f]$ of $f$.
<!-- eng end -->
##### Exercise 5
回顧一個_內積_ $\inp{\cdot}{\cdot}$ 必須符合以下的條件:
1. $\inp{\bx_1 + \bx_2}{\by} = \inp{\bx_1}{\by} + \inp{\bx_2}{\by}$.
2. $\inp{k\bx}{\by} = k\inp{\bx}{\by}$.
3. $\inp{\bx}{\by} = \inp{\by}{\bx}$.
4. $\inp{\bx}{\bx} \geq 0$, and the equality holds if and only if $\bx = \bzero$.
<!-- eng start -->
Recall that an inner product $\inp{\cdot}{\cdot}$ has to have the following properties:
1. $\inp{\bx_1 + \bx_2}{\by} = \inp{\bx_1}{\by} + \inp{\bx_2}{\by}$.
2. $\inp{k\bx}{\by} = k\inp{\bx}{\by}$.
3. $\inp{\bx}{\by} = \inp{\by}{\bx}$.
4. $\inp{\bx}{\bx} \geq 0$, and the equality holds if and only if $\bx = \bzero$.
<!-- eng end -->
##### Exercise 5(a)
令 $\beta$ 為 $\mathbb{R}^n$ 中的一組基底。
定義一個新的雙變數函數 $\inp{\bx}{\by}_\beta = \inp{[\bx]_\beta}{[\by]_\beta}$﹐
其中 $\inp{[\bx]_\beta}{[\by]_\beta}$ 指的是 $\mathbb{R}^n$ 中的標準內積。
驗證 $\inp{\cdot}{\cdot}_\beta$ 也是 $\mathbb{R}^n$ 上的另一種內積。
<!-- eng start -->
Let $\beta$ be a basis of $\mathbb{R}^n$. Define a new bivariate function $\inp{\bx}{\by}_\beta = \inp{[\bx]_\beta}{[\by]_\beta}$, where $\inp{[\bx]_\beta}{[\by]_\beta}$ is the standard inner product on $\mathbb{R}^n$. Verify that $\inp{\cdot}{\cdot}_\beta$ is also an inner product on $\mathbb{R}^n$.
<!-- eng end -->
<font color=blue>
Answer for 5(a):
To verify that $\inp{\cdot}{\cdot}_\beta$ is also an inner product on $\mathbb{R}^n$, we check the four defining properties of an inner product one by one.
For the first property:
1. $\inp{\bx_1 + \bx_2}{\by}_\beta = \inp{\bx_1}{\by}_\beta + \inp{\bx_2}{\by}_\beta$.
$$\begin{aligned}
\inp{\bx_1 + \bx_2}{\by}_\beta &= \inp{[\bx_1 + \bx_2]_\beta}{[\by]_\beta} \\
&= \inp{[\bx_1]_\beta + [\bx_2]_\beta}{[\by]_\beta} \\
&= \inp{[\bx_1]_\beta}{[\by]_\beta}+\inp{[\bx_2]_\beta}{[\by]_\beta} \\
&= \inp{\bx_1}{\by}_\beta + \inp{\bx_2}{\by}_\beta.
\end{aligned}
$$
This verifies the first property.
For the second property:\
2. $\inp{k\bx}{\by}_\beta = k\inp{\bx}{\by}_\beta$.
$$\begin{aligned}
\inp{k\bx}{\by}_\beta &= \inp{[k\bx]_\beta}{[\by]_\beta} \\
&= \inp{k[\bx]_\beta}{[\by]_\beta} \\
&= k\inp{[\bx]_\beta}{[\by]_\beta} \\
&= k\inp{\bx}{\by}_\beta.
\end{aligned}
$$
This verifies the second property.
For the third property:\
3. $\inp{\bx}{\by}_\beta = \inp{\by}{\bx}_\beta$.
$$\begin{aligned}
\inp{\bx}{\by}_\beta &= \inp{[\bx]_\beta}{[\by]_\beta}\\
&= \inp{[\by]_\beta}{[\bx]_\beta} \\
&= \inp{\by}{\bx}_\beta.
\end{aligned}
$$
This verifies the third property.
For the fourth property:\
4. $\inp{\bx}{\bx}_\beta \geq 0$, and the equality holds if and only if $\bx = \bzero$.
By definition,
$$
\inp{\bx}{\bx}_\beta = \inp{[\bx]_\beta}{[\bx]_\beta} \geq 0,
$$
since the standard inner product satisfies the fourth property; moreover, equality holds if and only if $[\bx]_\beta = \bzero$.
Let $A$ be the matrix whose columns are the vectors of $\beta$, so that $[\bx]_\beta = A^{-1}\bx$.
Since $A$ is invertible, $A^{-1}\bx = \bzero$ if and only if $\bx = \bzero$.
Therefore, $\inp{\bx}{\bx}_\beta \geq 0$ with equality if and only if $\bx = \bzero$, and the fourth property is verified.
With all four properties of the inner product verified, we have verified that $\inp{\cdot}{\cdot}_\beta$ is also an inner product on $\mathbb{R}^n$.
</font>
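As a numerical illustration of 5(a), here is a sketch reusing the matrix $A$ from Exercise 1 and assuming a Sage session; the test vectors are made up for the check:
```python
# Check the four properties on concrete vectors for one concrete basis.
A = matrix([[1, 3, 1], [-3, -8, -1], [0, -1, -1]])
Ainv = A.inverse()

def inp_beta(x, y):
    # <x, y>_beta = <[x]_beta, [y]_beta>, the standard product of representations
    return (Ainv * x).inner_product(Ainv * y)

x1, x2, y = vector([1, 2, 3]), vector([0, 1, -1]), vector([2, 0, 1])
print(inp_beta(x1 + x2, y) == inp_beta(x1, y) + inp_beta(x2, y))  # True
print(inp_beta(5 * x1, y) == 5 * inp_beta(x1, y))                 # True
print(inp_beta(x1, y) == inp_beta(y, x1))                         # True
print(inp_beta(x1, x1) >= 0)                                      # True
```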
##### Exercise 5(b)
證明當 $\beta$ 是單位長垂直基底時﹐任意向量都有 $\inp{\bx}{\by}_\beta = \inp{\bx}{\by}$。
<!-- eng start -->
Show that if $\beta$ is orthonormal, then $\inp{\bx}{\by}_\beta = \inp{\bx}{\by}$ for any vectors.
<!-- eng end -->
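A sketch of the argument: let $A$ be the matrix whose columns are the vectors of $\beta$.  When $\beta$ is orthonormal, $A$ is orthogonal, so $[\bx]_\beta = A\trans\bx$ and $[\by]_\beta = A\trans\by$.  Writing the standard inner product as $\inp{\bx}{\by} = \bx\trans\by$, we get
$$
\inp{\bx}{\by}_\beta = \inp{A\trans\bx}{A\trans\by} = (A\trans\bx)\trans(A\trans\by) = \bx\trans AA\trans\by = \bx\trans\by = \inp{\bx}{\by}.
$$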
:::info
collaboration: 1
4 problems: 4
- done: 2, 3, 4a, 5(a)
- pending: none
extra: 1
moderator: 1
quality control: 1
:::