---
title: Ch6-0
tags: Linear algebra
GA: G-77TT93X4N1
---
# Chapter 6 extra note 0
## Selected lecture notes
> eigenvalue/eigenvector
**Notation:**
* $\mathcal{L}(V, W)$: The set of all linear transformations from the vector space $V$ to the vector space $W$.
* $\mathcal{L}(V)$: The set of all linear transformations from the vector space $V$ to itself.
* $I$: the identity transformation.
:::info
**Definition:**
Let $V$ be a vector space and $T\in\mathcal{L}(V)$. A scalar $\lambda\in F$ is called an eigenvalue of $T$ if there exists a vector ${\bf v}\in V\setminus\{\bf 0\}$ such that $T({\bf v}) = \lambda {\bf v}$. In addition, ${\bf v}$ is called an eigenvector corresponding to $\lambda$.
:::
**Proposition:**
Let $V$ be a (*finite dimensional*) vector space, $T\in\mathcal{L}(V)$, $\lambda\in F$. The following are equivalent:
1. There exists ${\bf v}\in V\setminus\{\bf 0\}$ such that $T({\bf v})=\lambda {\bf v}$.
2. There exists ${\bf v}\in V\setminus\{\bf 0\}$ such that $(T-\lambda I)({\bf v})={\bf 0}$.
3. $T-\lambda I$ is not one-to-one.
4. $T-\lambda I$ is not onto.
   > pf:
   > Since
   > $$
   > \text{dim}(V) = \text{dim}(\text{Ker}(T-\lambda I))+\text{dim}(\text{Ran}(T-\lambda I)),
   > $$
   > and $\text{dim}(\text{Ker}(T-\lambda I))\ge 1$, we have $\text{dim}(\text{Ran}(T-\lambda I))< \text{dim}(V)$, so $T-\lambda I$ is not onto.
5. $\text{Ker}(T-\lambda I)\ne \{\bf 0\}$, i.e., $\text{dim}(\text{Ker}(T-\lambda I))\ge 1$.
6. $T-\lambda I$ is not invertible.
:::info
**Definition:**
Let $V$ be a vector space, $T\in\mathcal{L}(V)$, the $\lambda$-eigenspace is
$$
V_{\lambda} = \{{\bf v}\in V \,|\, T({\bf v}) = \lambda{\bf v}\}.
$$
:::
**Remark:**
${\bf 0}\in V_{\lambda}$ for any $\lambda\in F$.
**Proposition:**
* $V_{\lambda}$ is a vector subspace.
* $\text{Ker}(T) = V_0 = \{{\bf v}\in V \,|\, T({\bf v}) = {\bf 0}\}$.
:::info
**Definition:**
The *geometric multiplicity* of an eigenvalue $\lambda$ of $T$ is defined as $\text{dim}(V_{\lambda})$.
:::
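
As a numerical illustration (not from the notes): if $A$ is a matrix representing $T$, then $\text{dim}(V_\lambda)=\text{dim}(\text{Ker}(A-\lambda I)) = n - \text{rank}(A-\lambda I)$. A minimal sketch with NumPy, using a hypothetical diagonal matrix:

```python
import numpy as np

# Hypothetical example: a diagonal matrix whose eigenvalue 2 is repeated,
# so the 2-eigenspace is two-dimensional.
A = np.diag([2.0, 2.0, 3.0])
lam = 2.0

# Geometric multiplicity = dim Ker(A - lam I) = n - rank(A - lam I).
n = A.shape[0]
geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(geo_mult)  # 2
```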
:::danger
Q: Does every linear transformation have an eigenvalue?
:::
**Example 1:** (No eigenvalue)
Let $V=\{\bf 0\}$, $T\in\mathcal{L}(V)$, $T({\bf v})={\bf v}$. Since $V$ contains no nonzero vector, there is no eigenvector and hence no eigenvalue.
**Example 2:** (No eigenvalue)
Let $V=\mathbb{R}^2$ with scalar field $\mathbb{R}$. We define
$$
T\left(\begin{bmatrix}x\\ y\end{bmatrix}\right) = \begin{bmatrix}-y\\ x\end{bmatrix}.
$$
This $T$ rotates every vector by $90^\circ$, so no nonzero vector in $\mathbb{R}^2$ is mapped to a real multiple of itself; $T$ has no eigenvalue over $\mathbb{R}$.
**Example 3:** (Has eigenvalue)
Let $V=\mathbb{C}^2$ with scalar field $\mathbb{C}$. We define
$$
T\left(\begin{bmatrix}x\\ y\end{bmatrix}\right) = \begin{bmatrix}-y\\ x\end{bmatrix}.
$$
The eigenvalues are $i$ and $-i$.
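
Examples 2 and 3 use the same matrix; whether eigenvalues exist depends on the scalar field. A quick check with NumPy (which computes eigenvalues over $\mathbb{C}$):

```python
import numpy as np

# Matrix of T([x, y]) = [-y, x] with respect to the standard basis.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Over R there is no real eigenvalue; over C the eigenvalues are i and -i.
eigvals = np.linalg.eigvals(A)
print(np.sort_complex(eigvals))
```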
**Example 4:** (Has eigenvalue)
Let $V=C^\infty(\mathbb{R})$ (infinitely differentiable functions, so that $T$ maps $V$ into itself) and define the linear transformation $T(f) = f'$, the first derivative of $f$.
An eigenvalue $\lambda$ must satisfy $T(f)=f' = \lambda f$, so we find the eigenvector $f(x) = e^{\lambda x}$.
Therefore every $\lambda\in(-\infty, \infty)$ is an eigenvalue of $T$.
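
A numerical sanity check of Example 4, with hypothetical values for $\lambda$ and $x$: a central difference approximates $f'$, which matches $\lambda f$ for $f(x)=e^{\lambda x}$.

```python
import math

# Check f'(x) = lam * f(x) for f(x) = exp(lam * x), with an arbitrary lam.
lam, x, h = 1.7, 0.3, 1e-6
f = lambda t: math.exp(lam * t)
deriv = (f(x + h) - f(x - h)) / (2 * h)  # central difference
print(abs(deriv - lam * f(x)) < 1e-4)  # True
```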
---
**Theorem:**
Let $V$ be a vector space, $T\in\mathcal{L}(V)$. Assume $\{\lambda_1, \cdots, \lambda_n\}$ are distinct eigenvalues with corresponding eigenvectors $\{{\bf v}_1, \cdots, {\bf v}_n\}$. Then $\{{\bf v}_1, \cdots, {\bf v}_n\}$ is linearly independent.
* Proof:
> We prove the statement by induction.
>
> Let $n=1$.
> > Since ${\bf v}_1$ is an eigenvector, so ${\bf v}_1\ne {\bf 0}$ and hence $\{{\bf v}_1\}$ is linearly independent.
>
> Assume the statement holds for $n=k$: given $k$ distinct eigenvalues with $k$ corresponding eigenvectors, the eigenvectors form a linearly independent set.
>
> Consider $n=k+1$: let $\{\lambda_1, \cdots, \lambda_{k+1}\}$ be distinct eigenvalues with corresponding eigenvectors $\{{\bf v}_1, \cdots, {\bf v}_{k+1}\}$.
>
> > Since $\{{\bf v}_1, \cdots, {\bf v}_{k}\}$ is linearly independent, we only need to check whether or not ${\bf v}_{k+1}\in\text{span}\{{\bf v}_1, \cdots, {\bf v}_{k}\}$.
> >
> > Suppose
> > $$
> > \tag{1}
> > {\bf v}_{k+1} = \alpha_1{\bf v}_1 + \cdots + \alpha_k{\bf v}_k.
> > $$
> > Applying the transformation $T$ to (1) gives
> > $$
> > \tag{2}
> > \lambda_{k+1}{\bf v}_{k+1} = \alpha_1\lambda_1{\bf v}_1 + \cdots + \alpha_k\lambda_k{\bf v}_k.
> > $$
> > Multiplying (1) by $\lambda_{k+1}$ gives
> > $$
> > \tag{3}
> > \lambda_{k+1}{\bf v}_{k+1} = \alpha_1\lambda_{k+1}{\bf v}_1 + \cdots + \alpha_k\lambda_{k+1}{\bf v}_k.
> > $$
> > (2)-(3) then gives
> > $$
> > {\bf 0} =\alpha_1(\lambda_1-\lambda_{k+1}){\bf v}_1 + \cdots + \alpha_k(\lambda_k-\lambda_{k+1}){\bf v}_k.
> > $$
> > Since the eigenvalues are distinct, $\lambda_i-\lambda_{k+1}\ne 0$ for all $i$. Also $\{{\bf v}_1, \cdots, {\bf v}_{k}\}$ is linearly independent, so we must have
> > $$
> > \alpha_1 = \cdots = \alpha_k=0.
> > $$
> > Then (1) gives ${\bf v}_{k+1}={\bf 0}$, a contradiction, since ${\bf v}_{k+1}$ is an eigenvector and hence nonzero.
> > So ${\bf v}_{k+1}\notin\text{span}\{{\bf v}_1, \cdots, {\bf v}_{k}\}$ and $\{{\bf v}_1, \cdots, {\bf v}_{k+1}\}$ is linearly independent.
>
>
> Therefore, the statement is true by mathematical induction.
**Proposition:**
Suppose $\text{dim}(V)=n$. Then $T\in\mathcal{L}(V)$ has at most $n$ distinct eigenvalues.
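
The theorem can be observed numerically: a matrix with distinct eigenvalues has an eigenvector matrix of full rank. A sketch with a hypothetical $3\times 3$ matrix:

```python
import numpy as np

# Upper-triangular matrix with three distinct eigenvalues 1, 4, 6.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 4.0, 1.0],
              [0.0, 0.0, 6.0]])

lams, X = np.linalg.eig(A)  # columns of X are eigenvectors
# Distinct eigenvalues => eigenvectors are linearly independent,
# i.e. X has full rank.
print(np.linalg.matrix_rank(X))  # 3
```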
---
:::info
**Definition:**
Let $V$ be a vector space, $T\in\mathcal{L}(V)$ and $m\in\mathbb{N}$. We define
$$
T^m = \underbrace{T\circ T\circ \cdots \circ T}_\text{$m$ times}.
$$
:::
:::info
**Definition:**
Let $V$ be a vector space, $T\in\mathcal{L}(V)$, and let $p(z)=a_0 + a_1z + \cdots + a_mz^m$ be a polynomial. We define a transformation $p(T): V\to V$ as
$$
p(T) = a_0I + a_1T + \cdots + a_m T^m.
$$
:::
**Proposition:**
$p(T)\in\mathcal{L}(V)$.
**Proposition:**
Any two polynomials of a linear transformation commute, that is,
$$
p(T)q(T) = q(T)p(T),
$$
where $p(z)$ and $q(z)$ are polynomials.
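
A quick check of the commuting property with a hypothetical random matrix $T$ and the polynomials $p(z)=1+2z+z^2$ and $q(z)=3-z$:

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))
I = np.eye(3)

# p(T) = I + 2T + T^2 and q(T) = 3I - T, both polynomials in the same T.
pT = I + 2 * T + T @ T
qT = 3 * I - T
print(np.allclose(pT @ qT, qT @ pT))  # True
```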
**Theorem:**
Let $V$ be a *non-zero*, finite dimensional *complex* vector space, let $T\in\mathcal{L}(V)$, then $T$ has at least one eigenvalue.
* Proof:
> Assume $\text{dim}(V)=n$ and let ${\bf v}\in V\setminus\{\bf 0\}$. Then the set
> $$
> \{{\bf v}, T{\bf v}, \cdots, T^n{\bf v}\}
> $$
> contains $n+1$ vectors and is therefore a linearly dependent set. There exist scalars $a_0, \cdots, a_n$, not all zero, such that
> $$
> \tag{4}
> a_0{\bf v}+a_1T{\bf v}+ \cdots +a_nT^n{\bf v} = {\bf 0}.
> $$
>
> Let $m$ be such that $a_m\ne 0$ and $a_{m+1}=\cdots =a_n=0$.
> > Indeed, $m\ge 1$: if $m=0$, then (4) reduces to $a_0{\bf v}={\bf 0}$ with $a_0\ne 0$, forcing ${\bf v}={\bf 0}$, a contradiction.
>
> We define a polynomial $p(z)=a_0+a_1 z+\cdots + a_m z^m$. According to the [Fundamental theorem of Algebra](https://en.wikipedia.org/wiki/Fundamental_theorem_of_algebra), there exist $\mu_1, \cdots, \mu_m\in\mathbb{C}$ such that
> $$
> \tag{5}
> p(z) = a_m(z-\mu_1)\cdots(z-\mu_m).
> $$
> We can then rewrite (4) using (5) as
> $$
> \begin{align}
> {\bf 0} &= a_0{\bf v}+a_1T{\bf v}+ \cdots +a_nT^n{\bf v} \\
> &= p(T){\bf v}\\
> &= a_m(T-\mu_1 I)\cdots(T-\mu_m I){\bf v}.
> \end{align}
> $$
> Since ${\bf v}\ne {\bf 0}$ and $a_m\ne 0$, the product $(T-\mu_1 I)\cdots(T-\mu_m I)$ maps a nonzero vector to ${\bf 0}$, so there must be an $i\in \{1, \cdots, m\}$ such that $\text{Ker}(T-\mu_i I)\ne \{{\bf 0}\}$, that is, $\mu_i$ is an eigenvalue of $T$.
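
The proof is constructive, and the construction can be imitated numerically: build the dependent set $\{{\bf v}, A{\bf v}, \cdots, A^n{\bf v}\}$, extract a null-space coefficient vector, and take the roots of the resulting polynomial. A sketch with a hypothetical random matrix (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # viewed as a transformation on C^4
v = rng.standard_normal(n)       # any nonzero vector

# The n + 1 vectors v, Av, ..., A^n v are linearly dependent.
K = np.column_stack([np.linalg.matrix_power(A, k) @ v for k in range(n + 1)])
a = np.linalg.svd(K)[2][-1]      # null-space vector: K @ a ~ 0, i.e. p(A)v = 0

# Roots mu_1, ..., mu_m of p(z) = a_0 + a_1 z + ... + a_n z^n;
# at least one root must be an eigenvalue of A.
roots = np.roots(a[::-1])
eigs = np.linalg.eigvals(A)
print(min(abs(r - e) for r in roots for e in eigs))  # ~ 0
```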
---
Let $V$ be a *non-zero*, finite dimensional *complex* vector space, let $T\in\mathcal{L}(V)$. Suppose $\beta=\{{\bf v}_1, \cdots, {\bf v}_n\}\subset V$ is a basis, then we have a matrix representation of $T$ as
$$
A = [T]_{\beta}\in\mathbb{M}_{n\times n}.
$$
Let ${\bf x}=[x_1, \cdots, x_n]^T\in\mathbb{C}^n$ be an eigenvector of $A$ corresponding to the eigenvalue $\lambda$, so that $A{\bf x} = \lambda {\bf x}$. Then ${\bf v}=x_1{\bf v}_1 + \cdots +x_n{\bf v}_n$ is an eigenvector of $T$ corresponding to the same eigenvalue $\lambda$.
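
This correspondence can be checked numerically for a hypothetical matrix representation $A=[T]_\beta$: each eigenpair of $A$ gives the coordinates of an eigenvector of $T$.

```python
import numpy as np

# Hypothetical matrix representation A = [T]_beta.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

lams, X = np.linalg.eig(A)
for lam, x in zip(lams, X.T):
    # A x = lam x; the coordinates x_1, ..., x_n give the eigenvector
    # v = x_1 v_1 + ... + x_n v_n of T for the same lambda.
    assert np.allclose(A @ x, lam * x)
print("all eigenpairs verified")
```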