# Linear Transformation

## Change of basis

> Check out [3B1B](https://www.youtube.com/watch?v=P2LTAUO1TdA&list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab&index=13&ab_channel=3Blue1Brown).

### Basics

Usually, $(1, 0)^T$ and $(0, 1)^T$ are the basis vectors of a 2D plane. Suppose now we have **another basis (i.e., another world / vector space)**, say $\{(2,1)^T ,(3,-2)^T\}$. How can we **express their vectors in terms of our basis**?

Note that **from their point of view**, $\{(2,1)^T ,(3,-2)^T\}$ just looks like $\{(1,0)^T,(0,1)^T\}$; $\{(2,1)^T ,(3,-2)^T\}$ is **exactly how their basis is represented in ours**. Things are getting clearer, aren't they?

Now that we have **their basis expressed in terms of ours**, if we want to express their vector $(4, -1)^T$ in terms of our basis, we can just use the *linear combination* concept:

$$
\begin{align*}
&\text{In their world}: \\
&\begin{pmatrix}4\\-1\end{pmatrix} = 4\begin{pmatrix}1\\0\end{pmatrix} + (-1)\begin{pmatrix}0\\1\end{pmatrix}. \\
&\text{In our world}:\\
&\begin{pmatrix}a\\b\end{pmatrix} = 4\begin{pmatrix}2\\1\end{pmatrix} + (-1)\begin{pmatrix}3\\-2\end{pmatrix} = \begin{pmatrix}5\\6\end{pmatrix}.
\end{align*}
$$

See that? **$(5, 6)^T$ is exactly $(4, -1)^T$ from our point of view**.

What about the *opposite* direction? The only thing to do is **multiply by the inverse matrix**:

$$
\begin{pmatrix}2&3\\1&-2\end{pmatrix}^{-1} = {-1\over7}\begin{pmatrix}-2&-3\\-1&2\end{pmatrix}
$$

To state this formally: let $V$ be a vector space. We want a transformation $M:V\to V$ that changes the basis from $\{\vec v_1,\vec v_2,\cdots,\vec v_n\}$ to $\{\vec e_1, \vec e_2,\cdots,\vec e_n\}$, i.e., the **standard basis**. For any vector $\vec v = a_1\vec v_1 + \cdots + a_n\vec v_n$ (written as the coordinate vector $(a_1,\cdots,a_n)^T$), we can transform it into $\vec u = b_1\vec e_1 + \cdots + b_n\vec e_n$ by setting

$$
M = \begin{pmatrix}\vec v_1 &\cdots&\vec v_n\end{pmatrix},
$$

which yields

$$
\vec u = M\vec v.
$$

In the above case, we have

$$
\vec u=\begin{pmatrix}5\\6\end{pmatrix}, \quad M=\begin{pmatrix}2&3\\1&-2\end{pmatrix}, \quad \vec v = \begin{pmatrix}4\\-1\end{pmatrix}.
$$

To put it in a more general setting: suppose we want to change the basis from $\{\vec v_1,\vec v_2,\cdots,\vec v_n\}$ to $\{\vec w_1,\vec w_2,\cdots,\vec w_n\}$. Let

$$
\begin{align*}
M_V &= \begin{pmatrix}\vec v_1&\cdots&\vec v_n\end{pmatrix}, \\
M_W &= \begin{pmatrix}\vec w_1&\cdots&\vec w_n\end{pmatrix}.
\end{align*}
$$

Let $\vec v$ be the vector to transform, $\vec u$ the *intermediate* vector (in the standard basis), and $\vec w$ the final vector. Then

$$
\begin{align*}
\vec u &= M_V\vec v, \\
\vec w &= M_W^{-1}\vec u = M_W^{-1}M_V\vec v.
\end{align*}
$$

### Apply rotation

In the standard basis, a $90^\circ$ counter-clockwise rotation can be written as

$$
R = \begin{pmatrix}0&-1\\1&0\end{pmatrix}.
$$

What about the same rotation in $\{(2,1)^T ,(3,-2)^T\}$? To rotate the vector $\vec v=(4,-1)^T$ **in their world**, it turns out that we can:

1. Change the basis to ours, i.e., express $(4, -1)^T$ in our language: $(5,6)^T$.
2. Perform the rotation on $(5,6)^T$: $R(5,6)^T$.
3. Change the basis back to $\{(2,1)^T ,(3,-2)^T\}$.

The rotated vector is exactly

$$
\vec r = (M^{-1}RM)\vec v.
$$

Extraordinary, isn't it?

### Why change the basis?

Let's talk a little bit about **eigenvalues** and **eigenvectors** first.

#### Eigenvalues and Eigenvectors

> Check out [3B1B](https://www.youtube.com/watch?v=PFDu9oVAE-g&list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab&index=15&ab_channel=3Blue1Brown).

Given a linear transformation, say a matrix $M$, an **eigenvector** of $M$ is a vector **whose direction is not changed after applying the transformation**, and the corresponding **eigenvalue** is the **factor by which that eigenvector gets stretched or squashed** by the transformation. Here comes the familiar equation:

$$
\begin{align*}
&A\vec v = \lambda\vec v = (\lambda I)\vec v \\
\implies& (A-\lambda I)\vec v = \vec 0.
\end{align*}
$$

For non-zero $\vec v$, this equation holds only if $(A-\lambda I)$ does *not* have *linearly independent columns*, i.e.,

$$
\det(A-\lambda I) = 0.
$$

Solving this equation gives the eigenvalues. If $A$ happens to be a **diagonal matrix**, the eigenvalues of $A$ are exactly the numbers on its main diagonal, and the standard basis vectors are the corresponding eigenvectors! (Note that they form a basis.)

What if we use eigenvectors as a basis?

#### Eigenbasis

For a matrix $A$, if we have enough linearly independent eigenvectors to span the whole space (as many as the number of columns of $A$), we can make them form a basis, called an **eigenbasis**. In the eigenbasis, the original transformation $A$ in fact becomes **diagonal**! (Since eigenvectors only get stretched or squashed.) Thus, if we want to compute $A^n$, which is hard in the first place, we can attack it differently:

$$
E^{-1}AE = A' \implies E^{-1}A^{n}E = ({A'})^n,
$$

where $E$ is exactly the change-of-basis matrix whose columns are the eigenvectors. Since **$A'$ is diagonal**, $({A'})^n$ is easy to compute!

After so much effort, we've arrived at one answer to the question: **"why change the basis?"**

#### Little trick to compute 2D eigenvalues

Let $A$ be a $2\times2$ matrix with eigenvalues $\lambda_1$ and $\lambda_2$. Then

$$
\begin{align*}
&\text{tr}(A) = \lambda_1 + \lambda_2 = 2m, \\
&\det(A) = \lambda_1\lambda_2 = p, \\
\implies& \lambda_1, \lambda_2 = m\pm\sqrt{m^2 - p}.
\end{align*}
$$

> This follows from expanding and simplifying $\det(A-\lambda I)=0$! See [3B1B](https://www.youtube.com/watch?v=e50Bj7jn9IQ&list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab&index=15&ab_channel=3Blue1Brown).

## Summary

At its core, change of basis is just a special case of linear transformation; it only seems hard to grasp because of the difference in viewpoint. Let's wrap things up through the lens of a general linear transformation!
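As a quick numeric check before the summary, the $2\times2$ eigenvalue trick above translates directly into code. This is a minimal sketch (the function name and the example matrix are my own, made up for illustration), assuming the eigenvalues are real:

```python
import math

def eig2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the trace/determinant trick:
    m = tr(A)/2 is the mean of the eigenvalues, p = det(A) is their
    product, so the eigenvalues are m +/- sqrt(m^2 - p)."""
    m = (a + d) / 2              # half the trace
    p = a * d - b * c            # the determinant
    disc = math.sqrt(m * m - p)  # assumes real eigenvalues, i.e. m^2 >= p
    return m - disc, m + disc

# Made-up example: A = [[3, 1], [0, 2]] is triangular,
# so its eigenvalues are just the diagonal entries 2 and 3.
print(eig2(3, 1, 0, 2))  # (2.0, 3.0)
```
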
Let $T: V\to V$ map the basis $E = \{\vec {b_1}, \cdots,\vec {b_n}\}$ to $E' = \{\vec{b_1'},\cdots, \vec{b_n'}\}$. The linear transformation $T$ can be represented by a matrix $A$:

$$
A = \Big(T(\vec b_1), \cdots, T(\vec b_n) \Big)
$$

Then for any $\vec x\in V$,

$$
A\vec x = \vec{x'}
$$

where $\vec x$ is expressed in $E$ and $\vec{x'}$ in $E'$. In our running example, $E$ corresponds to the $\{\vec v_1,\vec v_2,\cdots,\vec v_n\}$ above, and $E'$ to the standard basis. Since each $\vec{v_i}$ is described from the standard basis's point of view, we in fact have $\vec{v_i} = T(\vec{b_i})$, i.e.,

$$
A = (\vec{v_1}, \cdots, \vec{v_n}) = M
$$

Hence,

$$
A\vec x = \vec{x'} = \vec u = M\vec{v}
$$

At its core, it is still just a linear transformation: left-multiplication by a matrix.
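To close, the whole running example can be sketched in plain Python (no libraries; the helper function names are mine, and $R$ is the standard $90^\circ$ counter-clockwise rotation matrix):

```python
# Change of basis for 2x2 matrices, using plain Python lists.
# M's columns are the other basis {(2,1), (3,-2)} written in ours.

def matvec(A, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1]]

def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def inv2(A):
    """Invert a 2x2 matrix via the adjugate formula."""
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
    return [[ A[1][1]/det, -A[0][1]/det],
            [-A[1][0]/det,  A[0][0]/det]]

M = [[2, 3], [1, -2]]   # change-of-basis matrix: their basis in our coordinates
R = [[0, -1], [1, 0]]   # 90-degree counter-clockwise rotation (standard basis)

v = [4, -1]             # the vector, in their coordinates
u = matvec(M, v)        # -> [5, 6]: the same vector in our coordinates

# Rotation "in their world": change basis, rotate, change back,
# i.e. r = (M^-1 R M) v.  Works out to [3/7, -16/7].
r = matvec(matmul(inv2(M), matmul(R, M)), v)
print(u, r)
```

The same three helpers cover every formula in this note: $M_W^{-1}M_V$ for a general basis change is just `matmul(inv2(M_W), M_V)`.
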