# More Matrices
### Eigenvalue / Eigenvector
- $A$: an $n\times n$ matrix, $v$: a vector $\in \mathbb{R}^n$, $\lambda:$ a scalar
- If there exists a nonzero vector $v$ such that $Av = \lambda v$, we say $v$ is an eigenvector of $A$ and $\lambda$ is the eigenvalue corresponding to $v$
- *Fact:* eigenvectors corresponding to distinct eigenvalues are linearly independent
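A quick numerical check, as a minimal NumPy sketch (the matrix `A` below is an arbitrary example, not from the notes): `np.linalg.eig` returns the eigenvalues and a matrix whose columns are eigenvectors, so we can verify $Av = \lambda v$ directly and confirm that eigenvectors for distinct eigenvalues are linearly independent.

```python
import numpy as np

# Arbitrary example matrix with distinct eigenvalues (5 and 2)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of `eigvecs` are the eigenvectors; `eigvals[i]` pairs with column i
eigvals, eigvecs = np.linalg.eig(A)

# Verify A v = lambda v for every eigenpair
for i, lam in enumerate(eigvals):
    v = eigvecs[:, i]
    assert np.allclose(A @ v, lam * v)

# Distinct eigenvalues => the eigenvectors are linearly independent,
# i.e. the matrix of eigenvectors has full rank
assert np.linalg.matrix_rank(eigvecs) == A.shape[0]
```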
### Matrix Diagonalization
- *Fact*: not all matrices are diagonalizable
- If an $n \times n$ matrix $A$ is diagonalizable, it can be decomposed as
- $A = PDP^{-1}$
- $D$ is a diagonal matrix with diagonal entries $\lambda_1, \dots, \lambda_n$
- $P = [p_1 \cdots p_n]$ is an invertible matrix
- $AP = PD$
- $AP = [A p_1 \cdots A p_n]$
- $PD = [\lambda_1 p_1 \cdots \lambda_n p_n]$
- $A p_i = \lambda_i p_i$. Thus, $p_i$ is an eigenvector of $A$ with eigenvalue $\lambda_i$
- Because $P$ is invertible, the column vectors of $P$ (the eigenvectors of $A$) are linearly independent
- $A^k = P D^{k} P^{-1}$ for any integer $k \ge 1$, so computing powers of $A$ only requires powering the diagonal entries of $D$ (computationally efficient! see the sketch below)
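A minimal sketch of the decomposition in NumPy, reusing the same kind of arbitrary example matrix: build $P$ from the eigenvectors and $D$ from the eigenvalues, confirm $A = PDP^{-1}$, and compute $A^k$ by powering only the diagonal entries of $D$.

```python
import numpy as np

# Arbitrary diagonalizable example matrix (illustrative assumption)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(eigvals)            # D holds the eigenvalues on its diagonal
P_inv = np.linalg.inv(P)

# A = P D P^{-1}
assert np.allclose(A, P @ D @ P_inv)

# A^k = P D^k P^{-1}: D^k is just elementwise powers of the diagonal,
# which is much cheaper than repeated matrix multiplication
k = 10
A_k = P @ np.diag(eigvals**k) @ P_inv
assert np.allclose(A_k, np.linalg.matrix_power(A, k))
```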
### Symmetric Matrix
- definition: $A = A^\top$
- *Fact:* a real symmetric matrix is diagonalizable, and its eigenvectors can be chosen to form an [orthonormal basis](https://hackmd.io/@charlotteTYC/prerequisites) of $\mathbb{R}^n$ (the spectral theorem; checked numerically at the end of this section)
- $A = U D U^{-1}$ (diagonalization)
- $U$ is an orthogonal matrix
- its column vectors are orthonormal
- $U^\top U = I = UU^\top$
- $U^{-1} = U^\top$
- $A = U D U^\top$
- Quadratic form: $x^{\top}Ax$
- $A$ is a symmetric matrix, $x$ is a vector
- $x^{\top}Ax = x^{\top} UDU^{\top}x$
- Let $y = U^{\top}x$. Then
$x^{\top}Ax = (x^{\top} U)D(U^{\top}x) = y^{\top}Dy = \sum_i \lambda_i y_i^2 \le \lambda_{\text{max}} \sum_i y_i^2 = \lambda_{\text{max}}\, y^\top y = \lambda_{\text{max}}\, x^\top x$
- *Proof.* $y^\top y = (U^\top x)^\top (U^\top x) = x^\top U U^\top x = x^\top U U^{-1}x = x^\top x$ (an orthogonal matrix preserves norms)
- $\lambda_{\text{max}}$ (resp. $\lambda_{\text{min}}$): the largest (resp. smallest) eigenvalue of $A$
- $\lambda_{\text{min}}\, x^\top x \le x^\top Ax \le \lambda_{\text{max}}\, x^\top x$ (the lower bound follows from the same argument, using $\lambda_{\text{min}} \sum_i y_i^2 \le \sum_i \lambda_i y_i^2$)
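A sketch of the symmetric case and the quadratic-form bounds in NumPy (the random symmetric matrix is an illustrative assumption): `np.linalg.eigh` is NumPy's routine for symmetric/Hermitian matrices and returns the eigenvalues in ascending order together with an orthonormal eigenbasis, so we can check $U^\top U = I = UU^\top$, $A = UDU^\top$, and $\lambda_{\text{min}}\, x^\top x \le x^\top Ax \le \lambda_{\text{max}}\, x^\top x$ on random vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                      # random symmetric matrix (A = A^T)

# eigh is specialized for symmetric matrices; eigenvalues come back in
# ascending order, and the columns of U are orthonormal eigenvectors
eigvals, U = np.linalg.eigh(A)

# U is orthogonal: U^T U = I = U U^T, hence U^{-1} = U^T
assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(U @ U.T, np.eye(4))

# A = U D U^T
assert np.allclose(A, U @ np.diag(eigvals) @ U.T)

# lambda_min x^T x <= x^T A x <= lambda_max x^T x for random x
lam_min, lam_max = eigvals[0], eigvals[-1]
for _ in range(1000):
    x = rng.standard_normal(4)
    q = x @ A @ x                      # the quadratic form x^T A x
    tol = 1e-9                         # slack for floating-point error
    assert lam_min * (x @ x) - tol <= q <= lam_max * (x @ x) + tol
```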