---
title: Linear Algebra II
tags:
- 2020
- LA
- NYCU
- note
author: Maxwill
lastUpdated: 2020/06/10
description: Linear Algebra II
copyright: "screenshots are mainly from Ming-Hsuan Kang's lecture notes, pasted only for note clarity and to save time"
---

# Linear Algebra II

* [NCTU Lecture Notes](https://hackmd.io/r-CG8R7xTZ2S3pRa6JD9FA)
* [NCTU 2020 spring](https://hackmd.io/xJD55HhBS-CiMdAiSCNYEg)

## week 1-1 (3/2/2020)

* review
  * linear independence
  * basis and dimension
  * dimension theorem
  * direct sum
  * matrix representation:
    $$ Rep_{\alpha,\beta}(T) = (Rep_\beta(T(\vec{\alpha_1}))\ ...\ Rep_\beta(T(\vec{\alpha_i}))) $$
  * change of basis
    $$ Rep_\alpha(T) = Rep_{\beta,\alpha}(id)Rep_\beta(T)Rep_{\alpha,\beta}(id) $$
  * determinant
  * invariant subspace
    * a subspace $W \subset V$ s.t. $\forall w \in W,\ T(w)\in W$
    * direct sum of invariant subspaces
  * eigenspace
    * def: $E(\lambda) = \{v \in V \mid T(v)=\lambda v \}$
    * $ker(A-\lambda I)$ is $E(\lambda)$ of $A$
* attendance 10%
* next quiz hint:
  * matrix representation + diagonalization
* lecturing
  * let $T:V\rightarrow V$ be a linear transformation
  * for $\vec{v} \in V$, what is the smallest T-invariant subspace containing it?
  * Theorem:

:::info
Let $W = span\{v, T(v), T^2(v), T^3(v), ...\}$, then
1. $W$ is a T-invariant subspace
2. let $dim(W) = m$; then $$\alpha = \{ \vec{v} , T(\vec{v}), T^2(\vec{v}), ..., T^{m-1}(\vec{v})\}$$ is a basis of $W$
3. $T^m(\vec{v}) = \sum a_iT^i(\vec{v})$
4. $$Rep_{\alpha}(T) =
\begin{bmatrix}
0 & 0 & ...& a_0 \\
1 & 0 & ...& a_1 \\
0 & 1 & ...& ...\\
0 & 0 & ...& a_{m-1}
\end{bmatrix}$$
5. determinant and characteristic polynomial by MI
6. $f_{T\mid _W}(x) = ?$
:::

:::info
Proof:
1. By definition: $W = span\{w, T(w), T^2(w), T^3(w), ...\}$
2. $\exists\ k \ni T^k(w) \in span\{w, T(w), T^2(w), ..., T^{k-1}(w)\}$, since $V$ is finite-dimensional
3. so $T^k(w)$ is a linear combination of $\alpha$
4. by induction, so is $T^{n}(w)$ for every $n > k$
5. then $T(\vec{w}) \in W$ is trivial
:::

* asked the teacher for concept confirmation
  * a maximal linearly independent set = a generating set in this case, but the teacher emphasized that the concepts are different
* uses $F$ instead of $R$ for generality

## week 1-2 (3/4/2020)

* matrix transformation self review
  * http://www.taiwan921.lib.ntu.edu.tw/mypdf/math02.pdf
  * ![](https://i.imgur.com/bLgkz3Z.png)
  * ![](https://i.imgur.com/dwAcAUN.png)
  * isomorphic linear transformation <=> invertible linear transformation
    * ![](https://i.imgur.com/9YRBUfZ.png)
  * standard representation
    * ![](https://i.imgur.com/zlKrbvI.png)
    * ![](https://i.imgur.com/OiKh18w.png)
  * change of coordinates
    * ![](https://i.imgur.com/mFl31C2.png)
  * find eigenvalues and eigenvectors
  * $A = P^{-1}QP$
* course
  * [use cyclic subspaces to prove the Cayley-Hamilton Theorem](https://ccjou.wordpress.com/2011/01/31/%E5%88%A9%E7%94%A8%E5%BE%AA%E7%92%B0%E5%AD%90%E7%A9%BA%E9%96%93%E8%AD%89%E6%98%8E-cayley-hamilton-%E5%AE%9A%E7%90%86/)
  * invariant subspaces are useful for analysis

## week 2-1 (March 9, 2020)

* 3/10/2020 asked the teacher; the teacher explained in detail and intuitively
  * What's the problem?
  * Why invariant subspaces?
    * to create more zeros
  * Why the Annihilator?
    * to get blocks of invariant subspaces
  * Usage of the Cayley-Hamilton Theorem?
    * to find such an $f(x)$ quickly (see the numerical sanity check below)
  * The next topic will be the Jordan form
    * what to fill in the blocks?
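Not part of the lecture, but a quick numerical sanity check of the Cayley-Hamilton fact mentioned in the Q&A above ($f_A(A) = 0$). A minimal sketch with NumPy; the matrix `A` is an arbitrary example of my own, not one from class.

```python
# Check Cayley-Hamilton numerically: the characteristic polynomial of A,
# evaluated at A itself, is the zero matrix. Illustration only.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# np.poly(A) returns the coefficients of det(xI - A), highest degree first.
coeffs = np.poly(A)          # for this A: [1, -5, 6], i.e. x^2 - 5x + 6

n = A.shape[0]
fA = sum(c * np.linalg.matrix_power(A, len(coeffs) - 1 - k)
         for k, c in enumerate(coeffs))

print(np.allclose(fA, np.zeros((n, n))))   # True: f_A(A) = 0
```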
* btw, being diagonalizable and being invertible are independent properties
  * [link](https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/)

### !T-invariant subspaces

* find a basis $\alpha$ of $W$, and extend it from a basis spanning $W$ to a basis spanning $V$
* then the matrix representation of $T : V \to V$ is
$$ \begin{pmatrix} T|_W & A \\ O & B \end{pmatrix} $$
* if we get a direct sum of T-invariant subspaces, it becomes
$$ \begin{pmatrix} T|_{W_1} & O \\ O & T|_{W_2} \end{pmatrix} $$

### Annihilator - Ann(T)

* $L(V,V) \rightarrow$ set of polynomials
* for $T \in L(V,V)$, $Ann(T) = \{f(x)\in F[x]\mid f(T) \text{ is the zero transformation}\}$
* a way to decompose $V$ into a direct sum of T-invariant subspaces

### !Cayley-Hamilton Theorem

* $f_T(T) \in Ann(T)$
* in this context, the usage is to find an $f(x)$ such that $f(T)\equiv 0$
* proof:
  $\forall v \in V$, let the T-invariant subspace generated by $v$ be $W$, and let $V = W \oplus W^{'}$.
  Since $[T]_{W\oplus W^{'}}$ can be expressed as
  $$ \begin{pmatrix} T|_{W} & A \\ O & B \end{pmatrix} $$
  the characteristic polynomial factors as $f_{T}(x) = g(x)\,f_{T|_W}(x)$; note that $g(x)$ is the $det(\lambda I-B)$ part.
  Then $f_{T|_W}(T)(v) = 0$ by the corollary that follows directly from the definition of $T|_W$. $Q.E.D.$

### !(General Eigenspace) Decomposition of V

* goal: $V=ker_\infty(T) \bigoplus Im_\infty(T)$
* goal 2: $V=\bigoplus E_\infty(\lambda)$
* part 1
  ![](https://i.imgur.com/lGqruvA.png)
  ![](https://i.imgur.com/xtfAkVt.png)
  Note: hint for T-invariance - the operations commute (are exchangeable)
* part 2
  ![](https://i.imgur.com/458oxBu.png)
  Note: in part 2, if $m$ is the smallest of all $m$'s satisfying the prerequisite, then $V = ker(T^{m-1}) \bigoplus Im(T^{m-1})$ does not hold in general (confirmed by the teacher)
* part 3
  ![](https://i.imgur.com/gwZTqOy.png)
* part 4
  ![](https://i.imgur.com/zNSsqzg.png)
* Note on Theorem 4
  * case minimal m = 0 => invertible (rank n) => trivial
  * case minimal m >= 1 => holds
* the notes above are from Ming-Hsuan Kang
* [proof of diagonalizability](https://math.okstate.edu/people/binegar/4063-5023/4063-5023-l18.pdf)

### Next: Jordan Form

* what exactly goes in the blocks?

### What Linear Algebra studies

* real-world problems, natural functions
* how to express them in a good basis (change of basis)
  * Fourier, Laplace...
  * meaningful, easy to compute
* linear transformations
  * differentiation
  * integration
  * etc.
* according to the teacher, it's quite different from Abstract Algebra
  * whereas I think the way of thinking is similar
  * LA just emphasizes linearity, dimension, etc. more

## Week 2-2

### quiz problem

* pA: calculation of a T-invariant subspace
* pB: let T : R3->R3 be a reflection transformation; show T is diagonalizable
  * hint: $T^2 = I$ + HW
* I was the first one finished

### other material

* [Invariant Subspace](https://math.okstate.edu/people/binegar/4063-5023/4063-5023-l18.pdf)

## Week 3-1

### Nilpotent LT

- definition: T is a nilpotent LT
  - $V = ker_{\infty}(T)$
  - $T^k = 0$ for some $k$
- use a similar technique as for T-invariant subspaces
- find a matrix rep of T
$$
\begin{pmatrix}
0 & 0 & 0 & ...&0 \\
1 & 0 & 0 & ...&0 \\
0 & 1 & 0 & ...&0 \\
0 & 0 & 1 & ...&0 \\
...&...&...&...&0
\end{pmatrix}
$$

### Theorem

## Week 3-2 skipped, 3-3 (self study @ Saturday)

### !Th. - V can be decomposed with a Nilpotent LT (on V)

![](https://i.imgur.com/xHP8xXc.png)

#### Proof sketch of part 3 (2020/3/23, corrected by teacher)

:::info
Let T be a nilpotent LT on V of index k+1; choose $v_i$
such that $T^{k}(v_i)$ forms a basis of $Im(T^{k})$.

Objective: $V = W\bigoplus cyclic(v_i)$ for some T-invariant subspace $W$

Proof:
- key: must have the **dot diagram** in mind
- induction on k+1, the index of T
- $W$ is the remaining part after removing the higher-dimensional cyclic subspaces
- extending the basis is like finding the current longest chains given the past longest ones
  - $v_i$: past longest
  - $u_i$: current longest

1. when $k = 0$: $T = 0$, $T^0 = I$, holds trivially
2. when the statement holds for index k:
   - $V = ker(T^k) \bigoplus Fv_i$
     - divide V according to whether $T^k$ is zero or non-zero
   - $V^{'} = ker(T^k)$, $T^{'} = T|_{V^{'}}$
     - $T^{'}$ is nilpotent on $V^{'}$ of index k
   - now we find a basis of $Im({T^{'}}^{k-1})$, a subspace of $ker(T^k)$
     - part original:
       - from $v_i$ to $T(v_i)$
       - ${T^{'}}^{k-1}(T(v_i)) = T^k(v_i)$
       - a linearly independent subset of $Im({T^{'}}^{k-1})$
     - part extended:
       - $u_i$
   - $ker(T^k) = W_0 \bigoplus cyclic(T(v_i)) \bigoplus cyclic(u_i)$
     - by the induction hypothesis
     - $T(v_i)$ together with $u_i$ forms a basis of $Im({T^{'}}^{k-1})$
   - $V = ker(T^k) \bigoplus Fv_i$
   - $V = W_0 \bigoplus cyclic(u_i) \bigoplus cyclic(T(v_i)) \bigoplus Fv_i$
   - $V = W \bigoplus cyclic(v_i)$
     - since $cyclic(u_i)$ is T-invariant
3. by induction, Q.E.D.

- from the proof steps one can actually see that W is a direct sum of cyclic subspaces
:::

Illustration Diagram
:::spoiler
![](https://i.imgur.com/zDbcml2.jpg)
:::

## HW3

## Week 4-1 (2020/3/23)

### !Jordan Form

#### part 1

- for an LT T
- $V = \bigoplus ker_\infty(T-\lambda_i I)$
- $V = \bigoplus E_\infty(\lambda_i)$

#### part 2

- $T-\lambda_i I$ is nilpotent on $E_\infty(\lambda_i)$
- $E_\infty(\lambda_i) = \bigoplus cyclic(v_i)$

#### part 3

- these $cyclic(v_i)$'s have a representation of the form
$$
\begin{pmatrix}
0 & 0 & 0 & ...&0 \\
1 & 0 & 0 & ...&0 \\
0 & 1 & 0 & ...&0 \\
0 & 0 & 1 & ...&0 \\
...&...&...&...&0
\end{pmatrix}
$$

#### Conclusion

- Jordan Form

#### Uniqueness?

- General Eigenspace Decomposition (yes)
- Cyclic Subspace Decomposition (no)
  - but the dimensions are unique
- generally, the matrix rep. can be said to be unique

#### Steps to find a Jordan form matrix rep.

1. find $f_A(x)$
2. find the **dot diagram** for each $\lambda$
   - by observing the nullity of $(T-\lambda I)^k$

ex:
0 <- $T(v_2)$ <- $v_2$
0 <- $v_1$
$$
\begin{pmatrix}
\lambda & 1 & 0 \\
0 & \lambda & 0 \\
0 & 0 & \lambda \\
\end{pmatrix}
$$

#### Steps to find a Jordan basis

1. just solve it from a basis of $E_\infty(\lambda)$

#### Case Study

1. $$
\begin{pmatrix}
5 & 7 & 1 & 1 & 5 \\
-2 & -3 & -1 & -1 & -5 \\
-1 & -3 & 1 & 1 & -3 \\
0 & 0 & 3 & 0 & 1 \\
3 & 3 & 1 & 0 & 6
\end{pmatrix}
$$
2. let V be the subspace of real functions spanned by $\alpha = \{x^2e^x, xe^x, e^x\}$, and let D be the differentiation operator; find the Jordan form of D and its Jordan basis
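Not from the lecture: SymPy can be used to double-check case-study computations like the ones above. A minimal sketch; `J` and `P` below are made-up examples of my own (not the case-study matrix), chosen so the expected Jordan form is known in advance.

```python
# Sanity-check the "steps to find a Jordan form" with SymPy. Illustration only.
from sympy import Matrix

J = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])            # a Jordan form we pick ourselves
P = Matrix([[1, 1, 0],
            [0, 1, 1],
            [1, 0, 1]])            # any invertible change of basis
A = P * J * P.inv()                # a matrix whose Jordan form we already know

P2, J2 = A.jordan_form()           # returns (basis matrix, Jordan form)
print(J2)                          # recovers the blocks of J (order may differ)
print(P2 * J2 * P2.inv() == A)     # True: A = P2 J2 P2^{-1}
```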
### Jordan-Chevalley Decomposition

$A = P^{-1}(D+N)P = P^{-1}DP + P^{-1}NP$, where $P^{-1}DP$ is the semi-simple part and $P^{-1}NP$ is the nilpotent part
- $\forall A \in M_n(\mathbb{C})$ there exist unique $D,N \in M_n(\mathbb{C})$ such that
  - A = D + N
  - DN = ND
  - D is diagonalizable
  - N is nilpotent

### Advantage of Jordan Form

#### power of a matrix

- by expanding $(D+N)^k$, using the fact that N is nilpotent

## Release of HW1, Quiz1, Quiz2

- HW1 30/30
- Quiz1 16/20
- Quiz2 20/20

:::spoiler
![](https://i.imgur.com/DuHu2QN.jpg)
![](https://i.imgur.com/dTMCqs8.jpg)
![](https://i.imgur.com/JkvlEdI.jpg)
:::

## Week 4-2

* HW and test
* approximate the Jordan form with a diagonalizable matrix

---

## Week 5

- next topic
  - Inner Product Space

---

## Week 5-1 - Inner Product Space

- Definition over $\mathbb{R}^N$ and $\mathbb{C}^N$
- Definition of orthogonality
- General definition when $\mathbb{F}$ is $\mathbb{R}$ or $\mathbb{C}$
- Inner Product of
  - Continuous Functions
  - Discrete Signals
- Discrete Fourier Transform (DFT)
  - meaningful basis
  - easy-to-compute basis
- Theorem:
  - Every inner product is induced from some basis

---

## Week 5-2 - Normal LT

### Symmetric and Hermitian

### Adjoint

#### definition

![](https://i.imgur.com/7Yupv1Q.png)
- defined on a vector space with an inner product
- defined without a basis => more intrinsic

#### self-adjoint

1. All eigenvalues of T are real.
2. T admits a set of eigenvectors which forms an orthonormal basis of V. (In particular, T is diagonalizable.)
3. Under an orthonormal basis, the conjugate transpose of the matrix representation of T is equal to the matrix representation of T∗.

### Normal and Ker/Im

![](https://i.imgur.com/PsVV0s0.png =70%x)
- proof of 5
![](https://i.imgur.com/fMvmZrz.png =70%x)

### Normal <=> Diagonalizable under an orthonormal basis

- proof
  - <=
    - trivial
  - =>
    - key: general eigenspaces = eigenspaces

---

## HW5

![](https://i.imgur.com/vHIsO4Y.png =12%x)
![](https://i.imgur.com/PvfaEYa.png =50%x)

---

## Week 6

- fill in week 5
  - diagonalization of a symmetric matrix
  - Gram-Schmidt and projection
  - norm on $C^2$
- 2020/04/07 making up HW5
:::spoiler
![](https://i.imgur.com/Q2fE7D1.jpg)
![](https://i.imgur.com/JH4pW9P.jpg)
![](https://i.imgur.com/M2iTpGt.jpg)
:::
**Q: relationship of Rn and Cn; Q: inner product induced by a basis**
- over a finite field, "> 0" is not well defined
  - positive definiteness needs an order relation to compare
  - for an inner product (>= 0), we would only have "= 0"
- [induced inner product](https://math.stackexchange.com/questions/1233384/how-to-choose-an-inner-product-with-respect-to-a-basis-in-such-a-way-that-this-b)

---

## Week 6-1 - Orthogonal/Unitary

### Definition

- preserves the inner product
![](https://i.imgur.com/XZYqbnJ.png =70%x)

### Equivalences

![](https://i.imgur.com/AqDA51E.png =70%x)

### Theorems revisited

![](https://i.imgur.com/29SKn75.png =70%x)

### Isometric

- $isometric$ + $f(\vec{0}) = \vec{0}$ $\iff$ $orthogonal$
  - must be an LT
![](https://i.imgur.com/gFsrEfr.png =70%x)
- proof: theorem
![](https://i.imgur.com/MNqZvVS.png =70%x)
- proof: LT
![](https://i.imgur.com/BuyYqU7.png =70%x)

#### general isometry

- affine map (LT + bias)

#### Isometric 2-D case study

- all orthogonal matrices in $\mathbb{R}^2$
- rotation/reflection by the parametric method

### Det and Eigenvalues of Orthogonal Matrices

- $det(A) = \pm 1$ by $det(AA^T) = det(I) = det(A)^2$
- $T$ is an orthogonal LT and $W$ is $T$-invariant implies $W^\perp$ is too
  - proof is an exercise
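A tiny NumPy check of the determinant fact above (my addition, not from the lecture): build an orthogonal matrix from a QR factorization and confirm $Q^TQ=I$ and $\det Q=\pm1$.

```python
# Numerical check of det(Q) = ±1 for an orthogonal matrix. Illustration only.
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))   # Q has orthonormal columns

print(np.allclose(Q.T @ Q, np.eye(4)))         # True: Q^T Q = I
print(np.isclose(abs(np.linalg.det(Q)), 1.0))  # True: det(Q) is +1 or -1
```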
#### Orthogonal 3-D case

- $det(A) = \pm 1$
- there exists $\lambda = \pm 1$ => a rotation axis or a reflection axis
  - by the direct sum of T-invariant subspaces (the exercise theorem)
- the remaining part is an orthogonal matrix of rank 2 with det = 1, which must be a rotation

### Questions

**Q: relationship of Rn and Cn; Q: inner product induced by a basis**
**Q: orthogonal - normal - symmetric (A\*=A) relations**

---

## HW6

## TA hour by MC Kang 2020/4/14, learned a lot

- HW6 3-2
  - a more analytic way: discuss det = +1, -1; only in 3D

### n-reflections theorem

[reference link](http://faculty.uml.edu/dklain/orthogonal.pdf)

### more on Orientation (advanced topic)

- isometric LTs come in 2 types!
  - those continuously reachable from the identity (rotations) vs. those that are not (reflections)
  - add a dimension: what happens? spin by dimension
- orientation-preserving orthogonal LT
  - det = +1 vs. -1
  - maintains the isometry during a continuous changing process!
- there are 2 orientations for every $R^n$, since 2 reflections = 1 rotation
- may not be covered in class :(

---

## Week 7 - self study - Review Concepts

### Inner product

#### Q: induced basis and induced inner product

### self-adjoint, Hermitian

#### conjugate transpose

#### self-adjoint Th

1. All eigenvalues of T are real.
2. T admits a set of eigenvectors which forms an orthonormal basis of V. (In particular, T is diagonalizable.)

#### self-adjoint, Hermitian

- Under an orthonormal basis, the conjugate transpose of the matrix representation of T is equal to the matrix representation of T∗.
- A symmetric/Hermitian matrix is a matrix representation of a self-adjoint linear transformation under an orthonormal basis.

---

### Normal LT

#### def

T∗ and T commute

#### Properties

![](https://i.imgur.com/HkoBgjm.png)

#### Theorem

A complex linear transformation is diagonalizable under some orthonormal basis if and only if it is normal.

---

### Orthogonal and unitary

#### def

T* = T^-1

#### n-reflections

## Week 7 quiz

- prove that one cannot find a continuous family of isometries reaching a reflection
  - first show that the bias b is irrelevant
  - a continuous map composed with a continuous map is continuous ($det \circ \tilde{F}(t)$)

---

## Week 7 - Real Canonical Form

### analysis of a real matrix A

- pairs of conjugate roots of $f_A(x)$
  - pairs of eigenvalues
- conclusion
  - for an eigenvalue $\lambda=a-bi$, let $\vec{v}$ be the corresponding eigenvector
  - let $v = v_1+iv_2$
  - $A(v_1+iv_2) = (av_1+bv_2) + i(-bv_1 + av_2)$
  - notice that $v_1$, $v_2$ must be linearly independent
    - else the eigenvalue is a real number
  - the complex block is
$$
\begin{pmatrix}
a & -b\\
b & a\\
\end{pmatrix}
$$
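A quick NumPy check (my addition, with arbitrary values of a and b) that the 2x2 real block above has eigenvalues $a \pm bi$:

```python
# Eigenvalues of the real canonical 2x2 block [[a, -b], [b, a]] are a ± bi.
import numpy as np

a, b = 2.0, 3.0                       # arbitrary example values
block = np.array([[a, -b],
                  [b,  a]])

print(np.linalg.eigvals(block))       # ≈ [2+3j, 2-3j], i.e. a ± bi
```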
### analysis of an orthogonal matrix A

- $\lambda_i = e^{-i\theta_i}=\cos\theta_i - i\sin\theta_i$
- the complex block is
$$
\begin{pmatrix}
\cos\theta & -\sin\theta\\
\sin\theta & \cos\theta\\
\end{pmatrix}
$$
- thus we can see an **orthogonal matrix** as **reflections + 2D rotations**
- notice that $\beta$ (the change of basis) can be chosen to be orthonormal
![](https://i.imgur.com/2hTaO28.png =70%x)

---

## next topic, quadratic form

---

## Week 7 - Quadratic Form

### Quadratic Form

![](https://i.imgur.com/JKMcS6V.png)
![](https://i.imgur.com/fpuD0wX.png)

### Taylor Series Revisited

![](https://i.imgur.com/trVz6Wk.png)
- k-th order approximation
- the 1st + 2nd-order terms can be used to determine a local max/min

### Example

![](https://i.imgur.com/xFH9JyZ.jpg)

### Case: 2 variables, Binary Quadratic Form

![](https://i.imgur.com/MnaSQbW.jpg)

### Case: Ternary calculation example

### General Case: N

![](https://i.imgur.com/zwRHur7.png)
![](https://i.imgur.com/vf4upzF.png)

### Key

- a quadratic form is (represented by) a real symmetric matrix
- diagonalizable (over R) with an orthogonal basis

---

## Week 7 - Conic Sections

### Purpose of this chapter

- zero sets

### Terms

- G: $ax^2 + bxy + cy^2 + dx + ey + f$
- Q: $ax^2 + bxy + cy^2 + dxz + eyz + fz^2$
- H: $ax^2 + bxy + cy^2$

### Zero Sets of a Binary Quadratic Form

- key: the signs of the two eigenvalues
  - $sign(\lambda_1) = sign(\lambda_2)$ => $\{(0, 0)\}$
  - $sign(\lambda_1) \neq sign(\lambda_2)$ => two lines
  - one of them zero => one line
![](https://i.imgur.com/Y11DB2y.jpg)

### dealing with the lower-degree (sub-quadratic) terms

![](https://i.imgur.com/3OtM6P5.jpg)

### zero set of a non-degenerate ternary quadratic form

![](https://i.imgur.com/Ro4QVvm.jpg)

#### Conic sections

![](https://i.imgur.com/N4PAneN.png)

### Conclusion

![](https://i.imgur.com/GDVuxkN.jpg)
- let H be $ax^2 + bxy + cy^2$
- with $Z(G) = Z(Q)\cap Z(z-1)$ ~= $Z(Q)\cap Z(z=0) = Z(H)$
- we can judge G by H if it is non-degenerate

---

## Week 7 - Equivalent Quadratic Forms, Signature

### Equivalent Quadratic Forms

- def: two quadratic forms are called equivalent iff
  - they can be obtained from each other by a change of basis
  - but since we only want the signature => the basis need not be orthonormal
- Diagonal Form
- note the symbol usage
  - $Q(\vec{x}^t)$
  - it is more common to use a row vector to represent the variables
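Not from the lecture: a minimal NumPy sketch that diagonalizes a binary quadratic form $ax^2+bxy+cy^2$ via its symmetric matrix and reads off the eigenvalue signs (hence the signature), matching the zero-set discussion above and previewing the change-of-variable review that follows. The coefficients are arbitrary example values.

```python
# Diagonalize Q(x, y) = a x^2 + b xy + c y^2 via its symmetric matrix and read
# off the signs of the eigenvalues (the signature). Example values only.
import numpy as np

a, b, c = 1.0, 4.0, 1.0                      # Q(x, y) = x^2 + 4xy + y^2
S = np.array([[a, b / 2],
              [b / 2, c]])                   # Q(v) = v^T S v

eigvals, eigvecs = np.linalg.eigh(S)         # orthonormal eigenbasis
print(eigvals)                               # here: one negative, one positive
# opposite signs -> zero set is two lines; same signs -> only the origin;
# one eigenvalue 0 -> a single line (degenerate case)
```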
### Review: change of variable to diagonal form

![](https://i.imgur.com/1MMbSdv.jpg)
- note that $\vec{y}^tD\vec{y} = \sum_i\lambda_iy_i^2$

#### use an orthogonal instead of an orthonormal basis

![](https://i.imgur.com/O13xj9G.png)

#### Standard form: the above changes the diagonal entries to $\pm 1$

![](https://i.imgur.com/JgFP6UQ.png)

### Signature of Real Quadratic Forms

![](https://i.imgur.com/P2gXonW.png)
- Note: the trace of the standard form (i.e., the signature) plus the rank determine the standard form

### Signature <=> (1-to-1) Equivalent Quadratic Forms

- pf: coming chapters

### Example is trivial

Q: Why the signature?

---

## HW 7

### Applications of Quadratic Forms (binary and ternary)

### Taylor expansion and extreme values

### Zero set, discussion of the degenerate form

---

## Week 8 - Office hour 2020/4/21

![](https://i.imgur.com/iZ55IXA.png =50%x)

### Taylor Expansion for determining local min/max

- the terms are polynomials in the $dx_i$
- higher-order terms are dominated by lower-order ones
  - products of infinitesimals
- we use the diagonalization technique to make things easy
  - change of variables so that only the square terms are non-zero

### about Trace

- $tr(AB) = tr(BA)$, proof by direct calculation
- for more factors, e.g. ABC
  - treat AB as D or BC as D and reduce to the 2-matrix case
  - $tr(ABC) \neq tr(BAC)$ in general
- $tr(A) = tr(PDP^{-1}) = tr(DP^{-1}P) = tr(D)$ in this case

---

## Week 8 - Positive Definite Quadratic Form

### Equivalences and proof

- Q is positive definite (definition)
- A is positive definite (all eigenvalues positive)
- Q has a unique minimum at 0

#### Case of semi-definiteness

### Theorem of n1

### Sylvester's criterion

#### Induction Proof

### extreme values at 0?

### Example

- Q: not positive definite, but could it still be positive semi-definite?
  - should one check higher orders? (the conclusion "not a local extremum" feels too fast IMO)
  - A (myself): bad question, this is not a discussion of derivatives (Hessian)

---

## Week 8 - Bilinear Form

### Definition

### Matrix Representation

- Note that ternary and higher (multilinear) forms have no matrix representation

### Coordinate-Free Quadratic Form

- isomorphism between quadratic forms and symmetric bilinear forms

### Relation between Quadratic forms and inner products

### Non-degenerate

### Multilinear form, Tensor product, Theorem (Extra)

---

## Week 9 - 2020/4/27 practice exam + review

### Midterm hints

- True & False
- Jordan Form
  - calculations
  - minimal polynomial and possible Jordan forms
- Quadratic form
  - calculations
    - ternary + binary
    - local extrema
    - zero set discussion
  - positive definite proof
- Symmetric, Normal, etc.
  - proof of diagonalizability
  - proof that the eigenvalues are all real (by inner product and real symmetric)
  - when there is a diagonal form (normal; a polynomial with no repeated roots vanishes at it)
  - Q: poly

### My Q

- Quantum Computing learning path
- proof of the Jordan form
  - about the nilpotent LT's cyclic decomposition
  - about the general eigenspace direct sum

---

## Week 9 - individual office hour @ 2020/4/28 15:00-16:00

### What does Quantum Computing study

- Q: is Lie Group/Algebra related?
  - Yes, brief introduction
- A: to make "good" universal gates
  - traditional bits and gates ($(1,0)^N \to (1,0)^M$)
  - now we want universal approximation for wave functions
  - Me: and applying them efficiently is the algorithmic part
- Q: What should I study?
  - a variety of fields
  - no need to go deep, but know the essential concepts
    - because there are many isomorphic relations between fields
  - think of the problem on a ball and design a way to extend it to higher dimensions

### Main clarifications

- Inner product space and basis
  - when is an inner product well defined
- minimal polynomial and its relation with the Jordan form
  - eigenspace decomposition (multiplicities of blocks)
- restatement of some important facts
- intuition about multilinear forms

### Summary Before Midterm

#### Jordan Form

- general eigenspace decomposition of an LT
  - T-invariant subspace
- cyclic subspace decomposition of a Nilpotent LT
  - nilpotent LT
- Cayley-Hamilton Theorem Revisited
- Real Canonical Form
- Topics:
  - Generalized kernel
  - Eigenspace discussion
  - minimal polynomial and possible Jordan forms

#### Inner Product Space

- an Inner Product is defined on vector spaces where
  - F is R or C (or at least a non-finite field, to make ">= 0" well defined)
  - $v^t\bar{w}$
- 3 main concepts, their matrix representations, and theorems
  - $A^*$: (inner product space) adjoint
    - (matrix) conjugate transpose
  - Self-Adjoint
    - $A = A^*$
  - Normal
    - $A, A^*$ commute
  - Unitary
    - $A^*A = I$
    - Isometric
- Self-adjoint
  - real symmetric or Hermitian
  - implies:
    - real eigenvalues
    - diagonalizable under a unitary basis
    - existence and uniqueness
- Normal
  - A complex linear transformation is diagonalizable under some orthonormal basis if and only if it is normal
  - the proofs are important and interesting

#### Bilinear Form

- Quadratic form
  - discussion of the zero set
    - conic curves
  - discussion of extreme values
    - Taylor, gradient, Hessian
  - equivalent quadratic forms, signature
  - 1-to-1 relation of symmetric bilinear forms with quadratic forms
- Inner product as a "symmetric" and "positive-definite" "bilinear form"
  - identity if induced by a basis
- multilinear form and tensor product
  - example illustration
- positive definite bilinear form
  - Sylvester's criterion

### Tricks

- prove v = 0 by <v,v> = 0
  - prove u-w = 0
- prove V = W via $W \subset V$ and $dim~W = dim~V$
- proof of practice exam 5.a-b
  - induction on dimension
  - consider the diagonal matrix case first!

### Pictures

:::spoiler
![](https://i.imgur.com/9hFd1sy.jpg)
![](https://i.imgur.com/5lFLoGK.jpg)
![](https://i.imgur.com/lx2wyKK.jpg)
![](https://i.imgur.com/Pa4Mfsp.jpg)
![](https://i.imgur.com/0wVySBq.jpg)
![](https://i.imgur.com/7Qcghdg.jpg)
![](https://i.imgur.com/Vuxy4hM.jpg)
![](https://i.imgur.com/o1beFyz.jpg)
![](https://i.imgur.com/ti5Q8bz.jpg)
![](https://i.imgur.com/OJsx0n9.jpg)
:::

### self study + Q for night office hour

- Hermitian is stronger than normal; is normal + all eigenvalues real enough?
- [link: equivalent definitions of normal](https://en.wikipedia.org/wiki/Normal_matrix#Equivalent_definitions)
  - normal := A* and A commute
  - normal <=> A* is a polynomial in A
  - normal <=> A and A∗ can be simultaneously diagonalized
  - actually, $A^* = P^t\bar{D}P$ by the proof in the pdf?
- commute <=> Simultaneous Diagonalizability
  - [Thm. 5.1](https://kconrad.math.uconn.edu/blurbs/linmultialg/minpolyandappns.pdf)
  - also Thm. 4.11 for the equivalence of diagonalizability
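A small numerical illustration of the normal-matrix facts listed above (my addition, not from the lecture): a 2D rotation is normal but not symmetric, and its eigenbasis is unitary, so it is diagonalizable under an orthonormal (complex) basis.

```python
# Check "normal => unitarily diagonalizable" on a 2D rotation. Illustration only.
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(R @ R.T, R.T @ R))                 # True: R commutes with R* (normal)

w, V = np.linalg.eig(R)                              # eigenvalues e^{±i theta}
print(np.allclose(V.conj().T @ V, np.eye(2)))        # True: the eigenbasis is unitary
print(np.allclose(V @ np.diag(w) @ V.conj().T, R))   # True: R = V diag(w) V*
```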
### online TA hour

- a diagonal matrix can be represented as a polynomial <=> a polynomial sending n prescribed values to n prescribed values needs degree n-1
  - Lagrange construction
  - if A wants to be represented as poly(B):
    - if B's entries at positions i and j are equal, A's entries at i and j have to be equal
    - then a smaller degree suffices
- commuting and both diagonalizable <=> Simultaneous Diagonalizability
  - together with the AB relation stated above, this is <=> A can be written as a polynomial in B
- "T and T* commute iff T* is a polynomial in T" is true
  - consider the situation after diagonalizing (always possible: normal)
  - T* is just $\bar{T}$
  - then use the Lagrange construction
- proof trick
  - 5-a
  ![](https://i.imgur.com/MvWVGWc.png)
  - about projections
    - pairwise products are the zero transformation
      - the lecture note is wrong here
    - the sum is the identity
- matrix congruence, quadratic form change of variable
  - https://en.wikipedia.org/wiki/Matrix_congruence
  - is under an "orthogonal basis"

### Remaining Q

- Taylor expansion for extreme values in general
  - my guess: 0, +-, 0, +- ...

---

## Week 13 - SVD

### Best fit subspace

#### motivation and equivalent definitions

### left singular values

### best fit subspace and SVD

- proof by induction on k

### Note on details

- $rank(A^tA) = rank(AA^t) = rank(A)$
  - proof by definition + the definition of the norm
  - https://math.stackexchange.com/questions/349738/prove-operatornamerankata-operatornameranka-for-any-a-in-m-m-times-n
- $A^tA$ is symmetric
- $A^tA$ is positive semi-definite
- $A^tA$ has an **orthonormal eigendecomposition**

### relation with the right singular values

- $Av_i = \sqrt{\lambda_i}u_i$
- one can prove these will be the (left) eigenbasis for $A^t$

### SVD - the decomposition

- standard basis => $\alpha = \{v_i\}$ => singular value diagonal matrix (m*n) => $\beta$ => standard basis
- $U\Sigma V^t$
- $Av_i = \sqrt{\lambda_i}u_i$
- $A = \sum _{i=1}^r \sqrt{\lambda_i}u_iv_i^t$
  - obs: a sum of rank-one matrices

### compact SVD

- uses only $m*r, r*r, r*n$

### confusion

- not familiar yet

### SVD view of the best fit k-subspace

### Compression with SVD

### PCA

- best fit affine subspace
- max variance subspace

### SVD vs PCA

- whether the mean is taken into account or not
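A minimal NumPy sketch of the SVD material above (my addition, not from the lecture): the decomposition as a sum of rank-one matrices, and a best low-rank approximation obtained by keeping the largest singular values.

```python
# SVD as a sum of rank-one matrices, and a rank-1 approximation. Illustration only.
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(5, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=False)    # compact SVD: A = U diag(s) Vt
r = len(s)

# A = sum_i s_i * u_i v_i^t
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r))
print(np.allclose(A, A_rebuilt))                    # True

# Best rank-1 approximation: keep only the largest singular value.
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])
print(np.linalg.norm(A - A1, 2) <= s[1] + 1e-12)    # spectral-norm error equals s[1]
```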
---

## HW 9 @ Week 13

- took a leave last week
- https://hackmd.io/Pcp80W3CT26eKqhUwIxzpw

---

## Week 13 - Spectral Drawing

- Vertices => n-dimensional unit vectors $e_i$
- Now we want to project to a subspace W
- Minimize the "Edge Energy Function"
  - defined as the sum of squared distances of the edges in the subspace W
- If we want the minimum => take eigenspaces of the Laplacian matrix
  - can be shown trivially by definition
  - having each connected component share the same projection is best => eigenspaces
  - used on a similarity graph => spectral clustering
- If we want the minimum while staying orthogonal to the trivial eigenspace of a (connected) graph
  - this is spectral drawing

## HW 10 - Spectral Drawing

- [LA HW 10 Spectral Drawing](/A0lGBNVlSQ-Ss7A5fluKsQ)

---

## Week 14 - Matrix Exponential

### Motivation

- solutions of differential equations

### Definition

- $exp(At) = \lim_{n\to\infty} \sum_{m=0}^{n}\frac{A^mt^m}{m!}$

### Matrix Limit

- Convergence, Absolute Convergence, Complex Convergence
- Def: a Matrix Limit is entry-wise
- Product of Matrix Limits
  - by element-wise discussion

### Discussion of a Jordan Block (exp(Jt))

### Solution of a linear system of DEs

---

## Week 14 - Discrete-Time Markov Chain

- the lecture
  - captures the eigen-related features of the transition matrix
  - discusses the asymptotic trend when applying the same P infinitely many times

### Positive Matrix

- defined entry-wise

### P(Probability) vector and Transition Matrix

- P-vector
  - entries sum to 1
- T-matrix
  - each column is a P-vector

### Eigenvalues of a Positive Transition Matrix have abs "<= 1"

- proof
![](https://i.imgur.com/XNaGNfd.png)

### The eigenvalue "1"

- 1 is the only eigenvalue (over the complex numbers) with absolute value 1
  - triangle inequality

#### Geometric Multiplicity = 1

- reminder: geometric multiplicity = number of Jordan blocks = dimension of the eigenspace
- proof
  - to make the bound an equality ("= 1"), we need $|v_i| = |v_j|~\forall i,j$
    - refer to the last section (the proof that all eigenvalues have abs <= 1), with j replaced by i
    - since P is positive
  - to pull the absolute value inside, all $v_i$ must have the same sign
    - the HW proves this explicitly
  - conclusion
    - $v = v_i(1,1,1,...,1) = v_i\vec{1}$ is the only eigenvector for the eigenvalue of absolute value 1

#### All Jordan blocks of eigenvalue one are of size 1

- reminder: dot diagram
- proof by contradiction
![](https://i.imgur.com/fRkDlL6.jpg)

### Asymptotic behavior of $\vec{\pi}^{(k)}$

#### $P^t$ and $P$ share the same characteristic polynomial

![](https://i.imgur.com/N2jPRIT.png)

#### Decompose the space into a direct sum using the Jordan form result

![](https://i.imgur.com/zNv82qv.png)

#### abs(eigenvalue) < 1 => goes to 0

![](https://i.imgur.com/GAoxLvu.jpg)

#### conclusion

![](https://i.imgur.com/24hm8JU.png)

### Conclusion

- take the kernel of $P-I$ to get the only stationary p-vector!

---

## Week 14: Q on transition matrices

- what happens if P is "stuck"?
  - states A, B, C
  - A to A, B to C, C to B
  - start at A => A
  - start at B/C => 0.5B + 0.5C
![](https://i.imgur.com/aXFdy0i.png)
- answer: "positive" => rank = 1
- will the eigenvalues of P all be positive? or 0?
  - 0 means a non-trivial nullspace exists

### HW 11

- if $P^k$ is positive, what about P?
  - can use the positive-transition-matrix result
  - hint: P has eigenvalue $\lambda$ => $P^k$ has eigenvalue $\lambda^k$

---

## Week 15 - Page Rank

- trivial

---

## Week 15 - Commuting LT

### The eigenspace of T1 is a T2-invariant subspace

- proof: $\forall v \in E_{T_1}(\lambda),\ T_1T_2v = T_2T_1v=T_2\lambda v$
  - so $E_{T_1}(\lambda)$ is T2-invariant

### Simultaneously Diagonalizable

- the restriction of a diagonalizable LT to an invariant subspace is still diagonalizable
- (see the small numerical sketch at the end of this week's commuting-LT notes)

### What about Jordan

- nilpotent counterexample

### Spectral drawing and symmetry

- let $\sigma : V \to V$ be an LT that permutes the vertices but keeps the same graph
- then $L$ (the Laplacian) and $\sigma$ commute
- pick the eigenspaces as a whole => symmetric!
  - since the picked eigenspaces are $\sigma$-invariant
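A small NumPy sketch (my addition, not from the lecture) of the "simultaneously diagonalizable" point above: two commuting symmetric matrices, such as the Laplacian $L$ and a graph symmetry, share an eigenbasis. Here the commuting pair `A`, `B` is constructed artificially by giving both the same eigenbasis `Q`.

```python
# Two commuting symmetric matrices share an orthonormal eigenbasis. Illustration only.
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))      # a random orthonormal basis
A = Q @ np.diag([1.0, 2.0, 3.0, 4.0]) @ Q.T       # distinct eigenvalues
B = Q @ np.diag([5.0, 5.0, -1.0, 0.0]) @ Q.T      # same eigenbasis, so A and B commute

print(np.allclose(A @ B, B @ A))                  # True: A and B commute

w, V = np.linalg.eigh(A)                          # eigenbasis of A (distinct eigenvalues)
M = V.T @ B @ V
print(np.allclose(M, np.diag(np.diag(M))))        # True: B is diagonal in A's eigenbasis
```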
---

## Week 15 - Dual Vector Spaces

### Dual Vector Space

![](https://i.imgur.com/Y1m8TgO.png)

### Dual basis

![](https://i.imgur.com/zykVJsg.jpg)
- note that all the isomorphisms are based on a certain basis

### Extra structures can be provided by the dual space

![](https://i.imgur.com/aHauR3y.png)
- Q: this
![](https://i.imgur.com/Rlb3tiS.png)

### The dual space of an inner product space has a natural dual basis

- use $f_v(w) = \langle w, v\rangle$
- if the $v_i$ form an orthonormal basis

### pullback

![](https://i.imgur.com/G2XVvs6.png)
- let T be a map from V to W, and let f be a map from W to F in W*
- get a map from V to F by composing V to W with W to F

### adjoint

![](https://i.imgur.com/zo2u0va.jpg)
- generalized to a pair V, W
- $T: V \to W$ gives $T^*: W \to V$ in a natural way

---

## Week 15 - HW12 is interesting

![](https://i.imgur.com/wnNBFOS.png)

### solution

## Reference Book

- Invariant Subspaces of Matrices with Applications

---
