---
disqus: ierosodin
---
# Eigenvalues and Eigenvectors
> Contact: [ierosodin](mailto:ierosodin@gmail.com)
###### tags: `Linear Algebra` `Study Notes`
* $Ax = \lambda x$, where $A$ is a square matrix
* $\lambda$: eigenvalue
* $x$: eigenvector
* $(A - \lambda I)x = 0$
* $(A - \lambda I)$ must be singular $\Rightarrow det(A - \lambda I) = 0$
* eigenvectors are in the $N(A - \lambda I)$
* $A^2x = \lambda^2x$, $A^kx = \lambda^kx$, $A^{-1}x = \lambda^{-1}x$ (provided $\lambda \neq 0$)
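These power and inverse rules are easy to verify numerically. A minimal NumPy sketch (the example matrix and the use of NumPy are my choices, not from the notes):

```python
import numpy as np

# Example matrix with eigenpair (lam, x); any matrix with a nonzero eigenvalue works.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams, vecs = np.linalg.eig(A)
lam, x = lams[0], vecs[:, 0]

assert np.allclose(A @ x, lam * x)                                # Ax = lambda x
assert np.allclose(A @ A @ x, lam**2 * x)                         # A^2 x = lambda^2 x
assert np.allclose(np.linalg.matrix_power(A, 5) @ x, lam**5 * x)  # A^k x = lambda^k x
assert np.allclose(np.linalg.inv(A) @ x, x / lam)                 # A^{-1} x = lambda^{-1} x
```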
* Example: $S = {\left[
\begin{array}{cc}
2 & 1 \\
1 & 2\\
\end{array}
\right]}$; for a general $2\times 2$ matrix $A = {\left[
\begin{array}{cc}
a & b \\
c & d\\
\end{array}
\right]}$:
* Find eigenvalue by solving $det(A - \lambda I) = 0$
* $\lambda = \frac{1}{2}\left[a + d\pm \sqrt{(a-d)^2 + 4bc}\right]$
* $\lambda_1 + \lambda_2 = a + d$ = trace of $A$, and $det(A) = \lambda_1\lambda_2$
* so for $S$, $\lambda = 3, 1$, with corresponding eigenvectors $= {\left[
\begin{array}{c}
1 \\
1 \\
\end{array}
\right]}, {\left[
\begin{array}{c}
1 \\
-1 \\
\end{array}
\right]}$
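A quick NumPy check of this example, including the trace and determinant identities (the library choice is an assumption of mine):

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams = np.linalg.eigvals(S)

assert np.allclose(np.sort(lams), [1.0, 3.0])      # eigenvalues 1 and 3
assert np.isclose(lams.sum(), np.trace(S))         # lambda_1 + lambda_2 = trace(S)
assert np.isclose(lams.prod(), np.linalg.det(S))   # lambda_1 * lambda_2 = det(S)
```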
* Symmetric matrices always have real eigenvalues
* Orthogonal eigenvectors if $\lambda_1 \neq \lambda_2$
* $S$ is symmetric, and $x$ and $y$ are eigenvectors of $S$ corresponding to distinct eigenvalues $\lambda_1$ and $\lambda_2$.
* $\lambda_1 x^Ty = (\lambda_1 x)^Ty = (Sx)^Ty = x^T(S^Ty) = x^T(Sy) = x^T(\lambda_2 y) = \lambda_2 x^Ty$
* $\Rightarrow (\lambda_1 - \lambda_2) x^Ty = 0$
* Since $\lambda_1 \neq \lambda_2$, $x^Ty = 0$
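The $2\times 2$ example above illustrates this: its two eigenvectors are orthogonal. A small NumPy verification (NumPy is my choice here):

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 1.0])    # eigenvector for lambda = 3
y = np.array([1.0, -1.0])   # eigenvector for lambda = 1

assert np.allclose(S @ x, 3.0 * x)
assert np.allclose(S @ y, 1.0 * y)
assert np.isclose(x @ y, 0.0)   # distinct eigenvalues => orthogonal eigenvectors
```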
* The number of nonzero eigenvalues is at most the rank of $A$
* For an $n\times n$ matrix $A$ of rank $r$, $dim$ $N(A) = n - r$, so there are $n - r$ independent eigenvectors whose eigenvalues are all zero, and these $n - r$ eigenvectors span an $(n - r)$-dimensional subspace of $R^n$.
* Similar matrices
* For every invertible matrix $B$, $BAB^{-1}$ has the same eigenvalues as $A$.
* For $Ax = \lambda x$, $(BAB^{-1})(Bx) = \lambda(Bx)$
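This can be checked numerically: a similarity transform keeps the eigenvalues and maps each eigenvector $x$ to $Bx$. A minimal NumPy sketch (the particular $A$ and $B$ are my choices):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 1 and 3
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])   # any invertible B
M = B @ A @ np.linalg.inv(B)

# Same eigenvalues as A
assert np.allclose(np.sort(np.linalg.eigvals(M)), [1.0, 3.0])

# Bx is the corresponding eigenvector of BAB^{-1}
x = np.array([1.0, 1.0])     # Ax = 3x
assert np.allclose(M @ (B @ x), 3.0 * (B @ x))
```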
* Side track
* $AB$ and $BA$ have the same non-zero eigenvalues
* $ABx = \lambda x \Rightarrow BA(Bx) = \lambda (Bx)$
* $Bx$ is an eigenvector of $BA$
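A deterministic NumPy check of this side track (the concrete matrices are my choices; both products here happen to be $2\times 2$, so all eigenvalues are shared):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# AB and BA share the same eigenvalues (both have trace 5, det 2)
assert np.allclose(np.sort(np.linalg.eigvals(A @ B)),
                   np.sort(np.linalg.eigvals(B @ A)))

# Each eigenvector x of AB gives an eigenvector Bx of BA
lams, X = np.linalg.eig(A @ B)
for lam, x in zip(lams, X.T):
    assert np.allclose((B @ A) @ (B @ x), lam * (B @ x))
```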
* Diagonalizing a matrix
* Suppose $A$ is an $n\times n$ matrix with $n$ linearly independent eigenvectors $x_1, x_2, ..., x_n$
* $A$ has a full set of n independent eigenvectors
* $A{\left[
\begin{array}{cccc}
x_1 & x_2 & ... & x_n \\
\end{array}
\right]} = {\left[
\begin{array}{cccc}
\lambda_1x_1 & \lambda_2x_2 & ... & \lambda_nx_n \\
\end{array}
\right]} \\
= {\left[
\begin{array}{cccc}
x_1 & x_2 & ... & x_n \\
\end{array}
\right]} {\left[
\begin{array}{cccc}
\lambda_1 \\
& \lambda_2 \\
& & ... \\
& & & \lambda_n \\
\end{array}
\right]} = X\Lambda$
* $\begin{split}AX = X\Lambda &\Rightarrow A = X\Lambda X^{-1} \\
&\Rightarrow A^k = X\Lambda^kX^{-1}
\end{split}$
* $A^kv = X\Lambda^kX^{-1}v$
* Assuming $v = c_1x_1 + c_2x_2 + ... + c_nx_n \\
\Rightarrow v = {\left[
\begin{array}{cccc}
x_1 & x_2 & ... x_n \\
\end{array}
\right]} {\left[
\begin{array}{c}
c_1 \\
c_2 \\
... \\
c_n \\
\end{array}
\right]} = Xc \\
\begin{split}\Rightarrow A^kv &= X\Lambda^kX^{-1}v = X\Lambda^kc \\
&= X{\left[
\begin{array}{c}
c_1\lambda_1^k \\
c_2\lambda_2^k \\
... \\
c_n\lambda_n^k \\
\end{array}
\right]} \\
&= c_1\lambda_1^kx_1 + c_2\lambda_2^kx_2 + ... + c_n\lambda_n^kx_n\end{split}$
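The whole derivation can be checked numerically: build $A^k$ from $X\Lambda^kX^{-1}$ and expand $A^kv$ in the eigenbasis. A NumPy sketch (the matrix, $k$, and $v$ are my choices):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams, X = np.linalg.eig(A)
k = 5

# A^k = X Lambda^k X^{-1}
Ak = X @ np.diag(lams**k) @ np.linalg.inv(X)
assert np.allclose(Ak, np.linalg.matrix_power(A, k))

# A^k v = sum_i c_i lambda_i^k x_i, where v = Xc
v = np.array([2.0, 0.0])
c = np.linalg.solve(X, v)   # coefficients of v in the eigenbasis
combo = X @ (lams**k * c)   # the vector with entries c_i lambda_i^k, mapped back by X
assert np.allclose(combo, np.linalg.matrix_power(A, k) @ v)
```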
* Nondiagonalizable matrices
* Occur when there are repeated eigenvalues
* GM
* Geometric multiplicity
* the dimension of $N(A - \lambda I)$
* AM
* Algebraic multiplicity
* the multiplicity of $\lambda$ as a root of $det(A - \lambda I) = 0$
* GM $\leq$ AM always; the matrix is nondiagonalizable when GM < AM for some eigenvalue
* $A = {\left[
\begin{array}{cc}
5 & 1 \\
0 & 5 \\
\end{array}
\right]}$, AM = 2, GM = 1
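Checking this defective example numerically (NumPy is my choice; GM is computed as $n - rank(A - \lambda I)$):

```python
import numpy as np

A = np.array([[5.0, 1.0],
              [0.0, 5.0]])

# AM = 2: lambda = 5 is a double root of det(A - lambda I) = (5 - lambda)^2
assert np.allclose(np.linalg.eigvals(A), [5.0, 5.0])

# GM = 1: dim N(A - 5I) = n - rank(A - 5I) = 2 - 1
assert np.linalg.matrix_rank(A - 5.0 * np.eye(2)) == 1
```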
* A matrix $A$ with a 0 eigenvalue is not invertible
* Suppose $A$ is a square, invertible matrix and, for the sake of contradiction, let 0 be an eigenvalue. Consider $(A - \lambda I) v = 0$ with $\lambda = 0$
* $\Rightarrow (A - 0\cdot I)v = 0 \\
\Rightarrow(A - 0)v = 0 \\
\Rightarrow Av = 0$
* We know $A$ is invertible, so $Av = 0$ forces $v = 0$; but $v$ must be non-trivial, since $\det(A - \lambda I) = 0$ guarantees a nonzero solution.
* Here lies our contradiction. Hence, 0 cannot be an eigenvalue.
* Suppose $A$ is a square matrix with an eigenvalue of 0. For the sake of contradiction, let's assume $A$ is invertible.
* Consider $Av = \lambda v$ with $\lambda = 0$: there exists a non-zero $v$ such that $Av = 0v$, i.e. $Av = 0$
* For an invertible matrix $A$, $Av = 0$ implies $v = 0$. So, $Av = 0 = A\cdot 0$.
* Since $v$ cannot be 0, $A$ maps two different vectors ($v$ and $0$) to the same output, so $A$ is not one-to-one.
* Hence, our contradiction, $A$ must not be invertible.
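The contrapositive is easy to observe numerically: a singular matrix has 0 among its eigenvalues and determinant 0. A minimal NumPy sketch (the rank-1 example is my choice):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # rank 1, so 0 must be an eigenvalue

lams = np.linalg.eigvals(A)
assert np.isclose(np.min(np.abs(lams)), 0.0)   # 0 is an eigenvalue...
assert np.isclose(np.linalg.det(A), 0.0)       # ...and A is singular
```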