{%hackmd @RintarouTW/About %}
# Matrix Algebra
## Vector Space
:::info
$$
vector\ addition\\
[V;+]
\cases{
\forall x,y\in V, x+y=y+x\ (commutative)\\
\forall x,y,z\in V, x+(y+z)=(x+y)+z\ (associative)\\
\forall x\in V,\exists\, e=0\in V, x+e=e+x=x\ (identity)\\
\forall x\in V,\exists y\in V, x+y=y+x=e\ (inverse)\\
}\\
scalar\ multiplication\\
[V;\cdot]
\cases{
\forall x,y\in V, a\in \mathbb{R}, a(x+y)=ax+ay\ (distributive)\\
\forall x\in V, a,b\in \mathbb{R}, (a+b)x = ax+bx\\
\forall x\in V, a,b\in \mathbb{R}, a(bx) = (ab)x\\
\forall x\in V, 1x=x
}
$$
:::
A Vector Space of Matrices
$$
V = M_{2\times 3}(\mathbb{R})
$$
Each element of $V$ is a $2\times 3$ matrix rather than an ordinary column vector, but since $V$ satisfies the axioms above, its elements are treated as vectors.
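As a quick numerical check (a sketch using NumPy; the sample matrices and scalars below are arbitrary), entrywise addition and scalar multiplication on $2\times 3$ arrays satisfy all eight axioms:

```python
import numpy as np

# Arbitrary sample elements of M_{2x3}(R) and sample scalars
X = np.array([[1., 2., 3.], [4., 5., 6.]])
Y = np.array([[0., -1., 2.], [3., 0., -2.]])
Z = np.ones((2, 3))
a, b = 2.0, -3.0
O = np.zeros((2, 3))                            # additive identity e = 0

assert np.allclose(X + Y, Y + X)                # commutative
assert np.allclose(X + (Y + Z), (X + Y) + Z)    # associative
assert np.allclose(X + O, X)                    # identity
assert np.allclose(X + (-X), O)                 # inverse
assert np.allclose(a * (X + Y), a * X + a * Y)  # distributive over vectors
assert np.allclose((a + b) * X, a * X + b * X)  # distributive over scalars
assert np.allclose(a * (b * X), (a * b) * X)    # scalar associativity
assert np.allclose(1 * X, X)                    # scalar identity
```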
#### Linear Combination
:::info
$$
v_1,v_2,\ldots,v_n,y\in V,\exists a_1,a_2,\ldots,a_n\in \mathbb{R},\\
y=a_1v_1+a_2v_2+\cdots+a_nv_n\iff y\text{ is a linear combination of }v_1,v_2,\ldots,v_n
$$
ex:
$$
y=(2,3)\in \mathbb{R^2}\\
y=2(1,0)+3(0,1)
$$
:::
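The coefficients of a linear combination can be recovered by solving a linear system. A NumPy sketch using the example above, with $v_1,v_2$ as columns of a matrix:

```python
import numpy as np

# Columns of V are v1 = (1, 0) and v2 = (0, 1); y = (2, 3) as in the example
V = np.array([[1., 0.],
              [0., 1.]])
y = np.array([2., 3.])
a = np.linalg.solve(V, y)     # coefficients a1, a2 of the linear combination
assert np.allclose(V @ a, y)  # y = a1*v1 + a2*v2: a linear combination
print(a)                      # [2. 3.]
```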
#### Generation of a Vector Space
Let $\{x_1,x_2,\ldots,x_n\}$ be a set of vectors in $V$ such that
$$
\forall y\in V, \exists a_1,a_2,\ldots,a_n\in \mathbb{R},\\
y=a_1x_1+a_2x_2+\cdots+a_nx_n
$$
This set is said to **generate** (or **span**) $V$, and is called a **generating set**.
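Whether a finite set spans $\mathbb{R}^n$ can be checked via the rank of the matrix whose columns are the vectors. A NumPy sketch (the three sample vectors are arbitrary):

```python
import numpy as np

# Columns are x1, x2, x3; they generate (span) R^2 iff the rank is 2
X = np.array([[1., 0., 1.],
              [0., 1., 1.]])
assert np.linalg.matrix_rank(X) == 2   # {x1, x2, x3} is a generating set of R^2

# Then any y in R^2 has coefficients a with X @ a = y (lstsq picks one solution)
y = np.array([5., -1.])
a, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(X @ a, y)
```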
#### Linear Independence/Dependence
:::info
$$
\cases{
\{v_1,v_2,\ldots,v_n\}\mid v_i\in V\\
a_1,a_2,\ldots,a_n\in \mathbb{R}\\
a_1v_1+a_2v_2+\cdots+a_nv_n=0\\
}\\
\text{The only solution is }
a_1=a_2=\cdots=a_n=0\iff \{v_1,v_2,\ldots,v_n\}\text{ is linearly independent,}\\
\text{otherwise it is linearly dependent}
$$
:::
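For $n$ vectors in $\mathbb{R}^n$, the zero-solution condition is equivalent to a nonzero determinant of the matrix with those vectors as columns. A NumPy sketch with arbitrary sample vectors:

```python
import numpy as np

v1, v2 = np.array([1., 2.]), np.array([3., 4.])
M = np.column_stack([v1, v2])
# For n vectors in R^n: det != 0 <=> only a1 = a2 = 0 solves a1*v1 + a2*v2 = 0
assert abs(np.linalg.det(M)) > 1e-12   # v1, v2 are linearly independent

w = 2 * v1                             # w is a multiple of v1: dependent pair
assert abs(np.linalg.det(np.column_stack([v1, w]))) < 1e-12
```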
#### Basis
:::info
$$
\cases{
B=\{v_1,v_2,\ldots,v_n\}\\
B\ generates\ V\\
B\ is\ linearly\ independent
}\iff B\text{ is a basis of }V
$$
If $\{v_1,v_2,\ldots,v_n\}$ is a basis of $V$, then every $y\in V$ can be **uniquely expressed as a linear combination of the basis**: $y=a_1v_1+a_2v_2+\cdots+a_nv_n$, where the coefficient tuple $(a_1,a_2,\ldots,a_n)$ is unique for each vector $y\in V$.
:::
#### Basis Size Is Constant
If one basis of $V$ contains $n$ elements (vectors), then every basis of $V$ contains $n$ elements (vectors).
#### Dimension of a Vector Space
If $\{v_1,v_2,\ldots,v_n\}$ is a basis of $V$, the dimension of $V$ is $n$, denoted $dim(V)=n$.
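For example, the six matrices $E_{ij}$ (a single 1 in entry $(i,j)$, zeros elsewhere) form a basis of $M_{2\times 3}(\mathbb{R})$, so its dimension is 6. A NumPy sketch, flattening each matrix to a row and checking the rank:

```python
import numpy as np

# The six matrices E_ij (a single 1 in entry (i, j)) of M_{2x3}(R),
# built here by reshaping the rows of the 6x6 identity into 2x3 arrays
basis = [np.eye(6)[k].reshape(2, 3) for k in range(6)]

# Flatten each basis matrix to a row; rank 6 means the set is linearly
# independent and spans the 6-dimensional space M_{2x3}(R)
stacked = np.array([B.ravel() for B in basis])
assert np.linalg.matrix_rank(stacked) == 6   # dim(M_{2x3}(R)) = 6
```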
## Diagonalization
### Eigenvalue and Eigenvector
:::info
$$
A\in M_{n\times n}(\mathbb{R})\\
\exists x\in \mathbb{R}^n, x\neq 0,\lambda\in \mathbb{R}, Ax=\lambda x\iff\cases{ \lambda\text{ is an eigenvalue of }A\\
x\text{ is called an eigenvector corresponding to the eigenvalue }\lambda
}
$$
#### Characteristic Equation
$$
\underbrace{det(A-\lambda I)}_{\text{Characteristic Polynomial}}=0
$$
From the eigenvectors' point of view, their directions are unchanged by the matrix $A$: along each eigenvector, $A$ acts as pure scaling by $\lambda$. This is what suggests that a diagonal (scaling) matrix representation of $A$ exists.
:::
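A NumPy sketch (the sample matrix is arbitrary) showing that `np.linalg.eig` returns pairs satisfying $Ax=\lambda x$, and that the eigenvalues are exactly the roots of the characteristic polynomial:

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])                # arbitrary sample matrix
eigvals, eigvecs = np.linalg.eig(A)     # columns of eigvecs are eigenvectors
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ x, lam * x)  # A x = lambda x

# The eigenvalues are exactly the roots of det(A - lambda*I) = 0
coeffs = np.poly(A)                     # characteristic polynomial coefficients
assert np.allclose(np.sort(np.roots(coeffs)), np.sort(eigvals))
```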
#### Linear Independence of Eigenvectors
Eigenvectors corresponding to distinct eigenvalues are linearly independent, so $det(\bmatrix{ev_1&ev_2}) \neq 0$.
#### Eigenspace
The solution space of $(A-\lambda I)x=0$ is called the **eigenspace** of $A$ corresponding to $\lambda$.
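A NumPy sketch of computing an eigenspace basis as the null space of $A-\lambda I$ via the SVD (the sample matrix and eigenvalue are assumptions for illustration):

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])       # arbitrary sample matrix, eigenvalues 5 and 2
lam = 5.0
M = A - lam * np.eye(2)

# Null space of (A - lam*I) via the SVD: right singular vectors whose
# singular values are ~0 span the eigenspace
_, s, vh = np.linalg.svd(M)
basis = vh[s < 1e-10]          # each row is an eigenspace basis vector
v = basis[0]
assert np.allclose(A @ v, lam * v)   # v really is an eigenvector for lam
```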
#### Diagonalizable Matrix
$$
A\in M_n(\mathbb{R})\\
\exists P, P^{-1}AP=D\ (\text{a diagonal matrix})\\
P\text{ is said to diagonalize the matrix }A\\
P=\bmatrix{ev_1&ev_2&\ldots&ev_n},\ ev_i\text{ is the eigenvector (as a column vector) for each eigenvalue}
$$
:::info
Why $P^{-1}AP=D$?
For example:
$$
\cases{
A=\bmatrix{a&b\\c&d}\\
\lambda_1,ev_1=\bmatrix{ev_{11}\\ev_{12}}\\
\lambda_2,ev_2=\bmatrix{ev_{21}\\ev_{22}}\\
P=\bmatrix{ev_1&ev_2}\\
P^{-1}P=PP^{-1}=I
}\\
A\cdot ev_1=\lambda_1\cdot ev_1\\
A\cdot ev_2=\lambda_2\cdot ev_2\\
A\cdot P=\bmatrix{\lambda_1\cdot ev_1&\lambda_2\cdot ev_2}\\
P^{-1}AP=\bmatrix{\lambda_1 P^{-1}ev_1&\lambda_2 P^{-1}ev_2}=\bmatrix{\lambda_1\bmatrix{1\\0}&\lambda_2\bmatrix{0\\1}}=\bmatrix{\lambda_1&0\\0&\lambda_2}=D\\
(P^{-1}ev_i\text{ is the }i\text{-th column of }P^{-1}P=I)
$$
:::
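The identity $P^{-1}AP=D$ can be verified numerically. A NumPy sketch (the sample matrix is arbitrary; `np.linalg.eig` already returns the eigenvector matrix $P$):

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])              # arbitrary sample matrix
eigvals, P = np.linalg.eig(A)         # columns of P are the eigenvectors
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(eigvals))   # P^{-1} A P is the diagonal matrix D
```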
#### Equivalent Matrix ($A^m = PD^mP^{-1}$)
$$
\begin{array}{rl}
D&=P^{-1}AP\\
D^m &= (P^{-1}AP)^m\\
&= \underbrace{(P^{-1}AP)(P^{-1}AP)\cdots (P^{-1}AP)}_{m\ times}\\
&=P^{-1}APP^{-1}AP\cdots P^{-1}AP\\
&=P^{-1}AIA\cdots IAP\\
&=P^{-1}A^mP\\
\end{array}\implies PD^mP^{-1}=A^m
$$
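A NumPy sketch verifying $A^m = PD^mP^{-1}$ for an arbitrary sample matrix:

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])              # arbitrary sample matrix
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)
m = 5
Am = P @ np.linalg.matrix_power(D, m) @ np.linalg.inv(P)
assert np.allclose(Am, np.linalg.matrix_power(A, m))   # A^m = P D^m P^{-1}
```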
###### tags: `math` `matrix` `vector`