# Linear Algebra CH6: Summary of Definitions
###### tags: `Linear Algebra`
> from Linear Algebra, Stephen Friedberg.
---
## 6.1.1 inner product
#### source: DP342
Let V be a vector space over F. An inner product on V is a function that assigns, to every ordered pair of vectors x and y in V, a scalar in F, denoted 〈x,y〉, such that for all x, y, and z in V and all c in F, the following hold: <br>
1. 〈x + z,y〉= 〈x,y〉+〈z,y〉
2. 〈cx,y〉= c〈x,y〉
3. $\overline{〈x,y〉}$=〈y,x〉, where the bar denotes complex conjugation (i.e., $\overline{〈x,y〉}$ is the complex conjugate of 〈x,y〉).
4. 〈x,x〉> 0 if x≠0. (The scalar 〈x,x〉 must be positive.)
- $〈\sum\limits_{i = 1}^n{a_iv_i,y}〉=\sum\limits_{i = 1}^n{a_i〈v_i,y〉}$
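A quick NumPy check of the four axioms for the standard inner product on $C^n$, 〈x,y〉= $\sum_i x_i\overline{y_i}$ (a minimal sketch; the vectors and the scalar below are arbitrary test data, not from the text):

```python
import numpy as np

def inner(x, y):
    # np.vdot conjugates its FIRST argument, so np.vdot(y, x) = sum_i x_i * conj(y_i),
    # which is <x, y> in Friedberg's convention (linear in the first slot).
    return np.vdot(y, x)

rng = np.random.default_rng(0)
x, y, z = (rng.standard_normal(3) + 1j * rng.standard_normal(3) for _ in range(3))
c = 2 - 1j

assert np.isclose(inner(x + z, y), inner(x, y) + inner(z, y))    # axiom 1
assert np.isclose(inner(c * x, y), c * inner(x, y))              # axiom 2
assert np.isclose(np.conj(inner(x, y)), inner(y, x))             # axiom 3
assert inner(x, x).real > 0 and np.isclose(inner(x, x).imag, 0)  # axiom 4
```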
## 6.1.2 conjugate transpose / adjoint
#### source: DP344
Let A ∈ $M_{m×n}(F)$. We define the conjugate transpose or adjoint of A to be the n×m matrix $A^*$ such that $(A^*)_{ij} = \overline{A_{ji}}$ for all i, j.
- This is the matrix counterpart of the adjoint with respect to the standard inner product (on $F^n$, 〈x,y〉= $y^*x$).
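A sketch of the adjoint identity in NumPy: `A.conj().T` is $A^*$, and under the standard inner product it satisfies 〈Ax,y〉= 〈x,A*y〉 (the matrix and vectors are random test data, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2))
x = rng.standard_normal(2) + 1j * rng.standard_normal(2)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

A_star = A.conj().T                # (A*)_{ij} = conj(A_{ji})
lhs = np.vdot(y, A @ x)            # <Ax, y>
rhs = np.vdot(A_star @ y, x)       # <x, A*y>
assert np.isclose(lhs, rhs)
```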
## 6.1.3 norm / length
#### source: DP346
Let V be an inner product space. For x ∈ V, we define the norm or length of x by $||x|| = \sqrt{〈x,x〉}$.
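A minimal check that the norm built from the inner product matches `np.linalg.norm` (the vector is an arbitrary example):

```python
import numpy as np

x = np.array([3.0, 4.0])
norm_from_inner = np.sqrt(np.vdot(x, x).real)          # ||x|| = sqrt(<x, x>)
assert np.isclose(norm_from_inner, np.linalg.norm(x))  # both give 5.0
```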
## 6.1.4 orthogonal / perpendicular
#### source: DP348
Let V be an inner product space. Vectors x and y in V are orthogonal (perpendicular) if 〈x,y〉= 0. A subset S of V is orthogonal if any two distinct vectors in S are orthogonal. A vector x in V is a unit vector if $||x||$ = 1. Finally, a subset S of V is orthonormal if S is orthogonal and consists entirely of unit vectors.
## 6.2.1 orthonormal basis
#### source: DP354
Let V be an inner product space. A subset of V is an orthonormal basis for V if it is an ordered basis that is orthonormal.
- That is, the vectors are mutually orthogonal and each has length 1.
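Section 6.2's Gram–Schmidt process is how such a basis is produced in practice; below is a minimal sketch (the input vectors are arbitrary and assumed linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a linearly independent list of real vectors into an orthonormal one."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for u in basis:
            w = w - np.vdot(u, w) * u        # subtract the projection onto u
        basis.append(w / np.linalg.norm(w))  # normalize to unit length
    return np.array(basis)

B = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])
assert np.allclose(B @ B.T, np.eye(2))       # rows are now orthonormal
```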
## 6.2.2 orthogonal complement
#### source: DP362
Let S be a nonempty subset of an inner product space V. We define $S^⊥$ (read "S perp") to be the set of all vectors in V that are orthogonal to every vector in S; that is, $S^⊥$ = {x∈V: 〈x,y〉= 0 for all y∈S}. The set $S^⊥$ is called the orthogonal complement of S.
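A sketch of computing $S^⊥$ numerically: stack the vectors of S as rows of a matrix M; then $S^⊥$ is the null space of M, which the SVD exposes (S here is an arbitrary example):

```python
import numpy as np

M = np.array([[1.0, 0.0, 1.0],       # S = {(1,0,1), (0,1,0)}
              [0.0, 1.0, 0.0]])
U, s, Vh = np.linalg.svd(M)
r = int(np.sum(s > 1e-10))           # numerical rank of M
perp = Vh[r:]                        # orthonormal basis of S-perp
assert np.allclose(M @ perp.T, 0)    # orthogonal to every vector in S
# perp is span{(1, 0, -1)} up to sign, as expected
```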
## 6.4.1 normal
#### source: DP383
Let V be an inner product space, and let T be a linear operator on V. We say that T is normal if $TT^* = T^*T$. An n×n real or complex matrix A is normal if $AA^* = A^*A$ (equivalently, if $L_A$ is normal, where $L^*_A(x) = A^*x$).
- Here $T^*$ is the adjoint of T, characterized by $〈T(x),y〉= 〈x,T^*(y)〉$ for all x, y.
> Q: Is this enough for diagonalization? A: Almost! (Over C, T is normal iff there is an orthonormal basis of eigenvectors of T.)
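A sketch illustrating the "almost": a rotation matrix is normal but not symmetric, so it has no real diagonalization, yet over C it diagonalizes with orthonormal eigenvectors (the angle is arbitrary test data):

```python
import numpy as np

t = 0.3
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])       # rotation: normal, not symmetric
assert np.allclose(A @ A.T, A.T @ A)          # A A* = A* A

w, V = np.linalg.eig(A)                       # complex eigenvalues e^{+-it}
assert np.allclose(V @ np.diag(w) @ np.linalg.inv(V), A)
assert np.allclose(V.conj().T @ V, np.eye(2)) # eigenvectors are orthonormal
```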
## 6.4.2 real skew-symmetric matrix
#### source: DP384
$A^t = -A$
## 6.4.3 self-adjoint / Hermitian
#### source: DP386
Let T be a linear operator on an inner product space V. We say that T is self-adjoint (Hermitian) if $T=T^*$. An n×n real or complex matrix A is self-adjoint (Hermitian) if $A = A^*$.
- "Self" hints that the operator is its own adjoint.
- If β is an orthogonal basis consisting of eigenvectors, then T can be diagonalized.
- If T is self-adjoint, then its matrix representation A (with respect to an orthonormal basis) is also self-adjoint.
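A sketch with `np.linalg.eigh`, NumPy's routine specialized to self-adjoint matrices; it returns real eigenvalues and orthonormal eigenvectors (the matrix is an arbitrary Hermitian example):

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)                    # A = A*

w, U = np.linalg.eigh(A)                             # w comes back real
assert np.allclose(U.conj().T @ U, np.eye(2))        # orthonormal eigenvectors
assert np.allclose(U @ np.diag(w) @ U.conj().T, A)   # unitary diagonalization
```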
## 6.4.4 positive definite / positive semidefinite
#### source: DP390
A linear operator T on a finite-dimensional inner product space is called positive definite (positive semidefinite) if T is self-adjoint and 〈T(x),x〉> 0 (〈T(x),x〉≧ 0) for all x≠0. <br> An n×n matrix A with entries from R or C is called positive definite (positive semidefinite) if $L_A$ is positive definite (positive semidefinite).
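Two common numerical checks for positive definiteness, sketched on an arbitrary symmetric example: all eigenvalues positive, or a Cholesky factorization succeeding:

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
assert np.all(np.linalg.eigvalsh(A) > 0)  # eigenvalues 1 and 3, both positive
np.linalg.cholesky(A)                     # raises LinAlgError iff A is not positive definite

x = np.array([3.0, 5.0])                  # any x != 0
assert x @ A @ x > 0                      # <T(x), x> > 0
```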
## 6.5.1 unitary operator & orthogonal operator
#### source: DP392
Let T be a linear operator on a finite-dimensional inner product space V (over F). If $||T(x)||$ = $||x||$ for all x∈V, we call T a unitary operator if F = C and an orthogonal operator if F = R.
- $Q^{-1}AQ = D$, where $Q^{-1} = Q^*$ (i.e., $Q^*Q = I$).
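A sketch of the defining property $||T(x)|| = ||x||$ for an orthogonal matrix (a rotation; the angle and test vectors are arbitrary):

```python
import numpy as np

t = 1.1
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
assert np.allclose(Q.T @ Q, np.eye(2))    # Q^{-1} = Q^t

rng = np.random.default_rng(2)
for _ in range(5):
    x = rng.standard_normal(2)
    assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # norm preserved
```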
## 6.5.2 reflection
#### source: DP395
Let L be a one-dimensional subspace of $R^2$. We may view L as a line in the plane through the origin. A linear operator T on $R^2$ is called a reflection of $R^2$ about L if T(x) = x for all x ∈L and T(x) = -x for all x∈$L^⊥$.
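For L spanned by a unit vector u, the reflection can be written $R = 2uu^t - I$ (since $uu^t$ is the projection onto L); a sketch with an arbitrary line:

```python
import numpy as np

u = np.array([1.0, 1.0]) / np.sqrt(2)        # L = span{u}
R = 2 * np.outer(u, u) - np.eye(2)           # reflection about L

v_perp = np.array([1.0, -1.0]) / np.sqrt(2)  # spans L-perp
assert np.allclose(R @ u, u)                 # T(x) = x on L
assert np.allclose(R @ v_perp, -v_perp)      # T(x) = -x on L-perp
assert np.allclose(R.T @ R, np.eye(2))       # reflections are orthogonal
```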
## 6.5.3 orthogonal matrix & unitary
#### source: DP395
A square matrix A is called an orthogonal matrix if $A^tA = AA^t = I$ and unitary if $A^*A = AA^*=I$.
## 6.5.4 rigid motion
#### source: DP398
Let V be a real inner product space. A function f : V→V is called a rigid motion if $||f(x) - f(y)||$ = $||x-y||$ for all x,y∈V.
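A sketch of a rigid motion that is not linear: an orthogonal map followed by a translation, f(x) = Qx + b (Q, b, and the test points below are arbitrary):

```python
import numpy as np

Q = np.array([[0.0, -1.0],
              [1.0,  0.0]])                  # rotation by 90 degrees
b = np.array([2.0, -3.0])                    # translation
f = lambda x: Q @ x + b

rng = np.random.default_rng(3)
x, y = rng.standard_normal(2), rng.standard_normal(2)
assert np.isclose(np.linalg.norm(f(x) - f(y)), np.linalg.norm(x - y))
```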
## 6.6.1 orthogonal projection
#### source: DP411
Let V be an inner product space, and let T : V →V be a projection. We say that T is an orthogonal projection if $R(T)^⊥ = N(T)$ and $N(T)^⊥ = R(T)$.
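For a full-column-rank A, the orthogonal projection onto R(A) is $P = A(A^tA)^{-1}A^t$; a sketch checking that P is idempotent and self-adjoint (A is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T   # projects onto the column space of A
assert np.allclose(P @ P, P)           # projection: P^2 = P
assert np.allclose(P, P.T)             # self-adjoint => orthogonal projection
```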
## 6.7.1 the singular values of A and of a linear transformation (SVD definition)
#### source: DP423
Let A be an m×n matrix. We define the singular values of A to be the singular values of the linear transformation $L_A$.
- Let A be an m×n matrix of rank r with the positive singular values $σ_1≧σ_2≧…≧σ_r$, and let $\Sigma$ be the m×n matrix defined by <br>
$\Sigma_{ij} = \begin{cases}
σ_i & \text{if } i = j ≦ r\\
0 & \text{otherwise}
\end{cases}$
- Matrix version:<br>
$Av_i = \begin{cases}
\lambda_iu_i & 1 ≦ i ≦ k\\
0 & k < i ≦ n
\end{cases}$
> The $v_i$ form a basis of the domain; the $u_i$ form a basis of the codomain.<br>
Then $A = U\Sigma V^*$, where $\Sigma$ is the diagonal-type matrix above, whose nonzero entries are the singular values (the square roots of the eigenvalues of $A^*A$).
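`np.linalg.svd` returns this factorization directly; a sketch using the matrix from the worked example at the end of these notes:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0]])
U, s, Vh = np.linalg.svd(A)             # s = [sqrt(5), 1], sorted decreasing

Sigma = np.zeros(A.shape)               # embed the values in an m x n matrix
Sigma[:len(s), :len(s)] = np.diag(s)
assert np.allclose(U @ Sigma @ Vh, A)   # A = U Sigma V*
```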
## 6.7.2 singular value decomposition (SVD)
#### source: DP423
Let A be an m×n matrix of rank r with positive singular values $σ_1≧σ_2≧…≧σ_r$. A factorization $A = U\Sigma V^*$, where U and V are unitary matrices and $\Sigma$ is the m×n matrix defined as in the note to 6.7.1, is called a singular value decomposition of A.
## SVD in brief
There exist orthonormal bases $\beta$ = {$v_1,v_2,...,v_n$} of the domain and $\gamma$ = {$u_1,u_2,...,u_m$} of the codomain such that: <br>
1. $T(v_i) = \lambda_iu_i$
2. $T^*T(v_i) = \lambda^2_iv_i$, i.e., the $\lambda^2_i$ are the eigenvalues of $T^*T$
> $\lambda_i$ = singular value
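A sketch of point 2: the singular values from the SVD agree with the square roots of the eigenvalues of $A^*A$ (A is random test data):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 3))
s = np.linalg.svd(A, compute_uv=False)              # singular values, descending
eigs = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # eigenvalues of A*A, descending
assert np.allclose(s**2, eigs)                      # lambda_i^2 = eig(T*T)
```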
## 6.7.3 pseudoinverse
#### source: DP426
Let V and W be finite-dimensional inner product spaces over the same field, and let T: V → W be a linear transformation. Let $L:N(T)^⊥ → R(T)$ be the linear transformation defined by L(x) = T(x) for all x $\in N(T)^⊥$. The pseudoinverse (or Moore-Penrose generalized inverse) of T, denoted by $T^\dagger$, is defined as the unique linear transformation from W to V such that<br>
$T^\dagger(y) = \begin{cases}
L^{-1}(y) & \text{for } y \in R(T)\\
0 & \text{for } y \in R(T)^⊥
\end{cases}$
- $L = T|_{N(T)^⊥} : N(T)^⊥ \to R(T)$ is one-to-one and onto.
Another way to write it, in terms of the SVD bases (with $r = rank(T)$):
$T^\dagger(u_i) = \begin{cases}
{1 \over \lambda_i}v_i & 1 ≦ i ≦ r\\
0 & i > r
\end{cases}$
> This handles the case where T is not one-to-one [$\because$ restricting T to $N(T)^⊥$ makes L one-to-one and onto R(T), so the inverse is unique].
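`np.linalg.pinv` computes this Moore-Penrose pseudoinverse; a sketch checking the four Penrose identities that characterize it (A is random test data):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 2))
Ap = np.linalg.pinv(A)

assert np.allclose(A @ Ap @ A, A)        # A A+ A  = A
assert np.allclose(Ap @ A @ Ap, Ap)      # A+ A A+ = A+
assert np.allclose((A @ Ap).T, A @ Ap)   # A A+ self-adjoint: projection onto R(A)
assert np.allclose((Ap @ A).T, Ap @ A)   # A+ A self-adjoint: projection onto N(A)-perp
```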
### Example
A = $\left(
\begin{array}{cc}
1 & 1 \\
0 & 1 \\
1 & 0 \\
1 & 1 \\
\end{array}
\right)$, find $A^\dagger$.
Sol.<br>
$A^*A = A^TA =$ $\left(
\begin{array}{cccc}
1 & 0 & 1 & 1 \\
1 & 1 & 0 & 1 \\
\end{array}
\right)$$\left(
\begin{array}{cc}
1 & 1 \\
0 & 1 \\
1 & 0 \\
1 & 1 \\
\end{array}
\right)$ = $\left(
\begin{array}{cc}
3 & 2 \\
2 & 3 \\
\end{array}
\right)$<br>
$rank(A^*A) =2$ <br>
$det\left(
\begin{array}{cc}
3 - \lambda & 2 \\
2 & 3 - \lambda \\
\end{array}
\right)$ $= \lambda^2 - 6\lambda + 5 = (\lambda - 5)(\lambda - 1) = 0$
Substituting $\lambda = 5$:<br>
$A^*A - 5I=$$\left(
\begin{array}{cc}
-2 & 2 \\
2 & -2 \\
\end{array}\right)$
This gives $u_1$; substituting $\lambda = 1$ similarly gives $u_2$:
$u_1 = \left(
\begin{array}{c}
1\over \sqrt2 \\
1\over \sqrt2 \\
\end{array}\right)$
$u_2 = \left(
\begin{array}{c}
1\over \sqrt2 \\
-1\over \sqrt2 \\
\end{array}\right)$
:::warning
:warning: $u_1$ and $u_2$ must form an orthonormal basis.
:::
Thus we obtain U:
$U = \left(
\begin{array}{cc}
1\over \sqrt2 & 1\over \sqrt2 \\
1\over \sqrt2 & -1\over \sqrt2 \\
\end{array}\right)$
Note $W=AU$
$W=AU=$$\left(
\begin{array}{cc}
1 & 1 \\
0 & 1 \\
1 & 0 \\
1 & 1 \\
\end{array}
\right)$$\left(
\begin{array}{cc}
1\over \sqrt2 & 1\over \sqrt2 \\
1\over \sqrt2 & -1\over \sqrt2 \\
\end{array}\right)=$$\left(
\begin{array}{cc}
\sqrt 2 & 0 \\
1\over \sqrt 2 & -1\over \sqrt 2 \\
1\over \sqrt 2 & 1\over \sqrt 2 \\
\sqrt 2 & 0 \\
\end{array}
\right)=(w_1,w_2)$
Find $v_1$ and $v_2$:
- $v_i = {w_i \over \lambda_i}$
$v_1 = {w_1 \over \sqrt 5}$, $v_2 = {w_2 \over \sqrt 1} = w_2$
:::info
:information_source: Why? Recall:
$Q^*A^*AQ =\left(\begin{matrix}
\lambda^2_{1} & & & \\
 & \ddots & & \\
 & & \lambda^2_{k} & \\
 & & & 0
\end{matrix}\right)=W^*W$, where $W = AQ$.
$V^*W = V^*AU = \Sigma$
$〈w_i,w_i〉 = \lambda^2_i$ and $〈w_j,w_i〉 = 0$ for $j \neq i$, so the $w_i$ are orthogonal with lengths $\lambda_i$.
$\therefore v_i = {w_i \over \lambda_i}$
:::
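Carrying the example one step further (this last step is my own computation, not in the text): with $V = (v_1\ v_2)$, $A^\dagger = U\Sigma^{-1}V^t = \frac{1}{5}\left(\begin{matrix}1 & -2 & 3 & 1\\ 1 & 3 & -2 & 1\end{matrix}\right)$. A NumPy sketch confirming it; since A has full column rank, $A^\dagger = (A^tA)^{-1}A^t$ as well:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0],
              [1.0, 1.0]])
Ap = np.linalg.pinv(A)
assert np.allclose(Ap, np.array([[1.0, -2.0,  3.0, 1.0],
                                 [1.0,  3.0, -2.0, 1.0]]) / 5)
assert np.allclose(Ap, np.linalg.inv(A.T @ A) @ A.T)   # full-column-rank shortcut
```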