# [Transformer] Self-Attention and the Transformer

###### tags: `Literature Reading` `Self-Attention` `Transformer` `ViT`

### Related notes on ViT and the Transformer

- [[Transformer_CV] Vision Transformer (ViT) key notes](https://hackmd.io/tMw0oZM6T860zHJ2jkmLAA)
- [[Transformer] Self-Attention and the Transformer](https://hackmd.io/fmJx3K4ySAO-zA0GEr0Clw) notes
- [[Self-supervised] Self-supervised Learning and Vision Transformers: key notes and recent developments](https://hackmd.io/7t35ALztT56STzItxo3UiA)
- [[Explainable AI] Transformer Interpretability Beyond Attention Visualization: interpretability and visualization for Transformers](https://hackmd.io/SdKCrj2RTySHxLevJkIrZQ)
- [[Transformer_CV] Masked Autoencoders (MAE) paper notes](https://hackmd.io/lTqNcOmQQLiwzkAwVySh8Q)

### The attention mechanism: concept and context

- Attention is a core ingredient of 'conscious' AI (ICLR 2020, Yoshua Bengio)
  - [ICLR 2020 Yoshua Bengio: attention is a core ingredient of "conscious" AI](https://medium.com/@andy6804tw/yoshua-bengio-%E6%B3%A8%E6%84%8F%E5%8A%9B%E6%98%AF-%E6%9C%89%E6%84%8F%E8%AD%98-ai%E7%9A%84%E6%A0%B8%E5%BF%83%E8%A6%81%E7%B4%A0-f091a643d0fc)

> The attention mechanism, in which a person (or an algorithm) focuses on one element or a few elements at a time, is a data-processing method in machine learning. It is widely used in natural language processing, image recognition, speech recognition, and many other kinds of machine-learning tasks.
>
> Bengio's talk builds on the two cognitive systems that the psychologist and economist Daniel Kahneman describes in his book *Thinking, Fast and Slow*:
> - The first system is unconscious: intuitive and fast, non-verbal and habitual, and it handles only implicit knowledge.
> - The second system is conscious: linguistic and algorithmic, and it combines reasoning and planning with explicit forms of knowledge.

- The focus of human visual attention
  - How many faces can you find in the image below?
  - ![](https://i.imgur.com/0Kf9Ha6.png =300x)
- The family of attention models
  ![](https://i.imgur.com/jr8Bo8l.png =500x)
  - from [An Attentive Survey of Attention Models](https://arxiv.org/pdf/1904.02874.pdf)
  - [[Paper guide] A taxonomy of attention-mechanism models](https://medium.com/ai-academy-taiwan/%E8%AB%96%E6%96%87%E5%B0%8E%E8%AE%80-%E6%B3%A8%E6%84%8F%E5%8A%9B%E6%A9%9F%E5%88%B6%E6%A8%A1%E5%9E%8B%E7%9A%84%E5%88%86%E9%A1%9E-bb827cf5865c)

### Why the Transformer matters, and its breakthrough ideas

#### The fourth major family of deep learning models

- In late 2021, Stanford researchers ranked the transformer as the fourth major family of deep learning models, after the MLP, CNN, and RNN, and defined the series of transformer-based models as foundation models.
  - [2108.07258 On the Opportunities and Risks of Foundation Models](https://arxiv.org/abs/2108.07258)
  - [Foundation Models](https://blog.inten.to/foundation-models-b89e7610057)

#### Comparison with CNNs and RNNs

![](https://i.imgur.com/vTDh4uv.png =500x)
![](https://i.imgur.com/Yc4rKT4.png =600x)

- Notation
  - n: sequence length
  - d: representation dimension
  - k: kernel size of convolutions
- Characteristics of the self-attention mechanism
  - Computation is fully parallelizable
  - Its maximum path length is the shortest of the three: O(1), regardless of sequence length
  - The shorter the path, the easier it is to capture and query global information
  - For a CNN, the larger k is, the larger the receptive field
  - Computational cost is high for long sequences (per-layer complexity grows quadratically with n)

### The original paper: [Attention is All you Need](https://arxiv.org/abs/1706.03762)

#### Overall structure of the Transformer model

- Model architecture
  ![Transformer](https://i.imgur.com/ZMxXphs.png =300x)
- Each word simultaneously computes its similarity to every word, itself included, to obtain attention scores
  - This can be read as building a similarity-score matrix over all pairs of words
  - ![](https://i.imgur.com/TE4WG3X.png =400x)
- Every word (token) in a sentence acts simultaneously as Query, Key, and Value
  - Q, K, and V are produced from the input through learnable weight matrices
- Attention score
  - The output is each embedding vector's V (Value) weighted by the attention weights
  - Attention weights = Q × Kᵀ, which is the relevance matrix between a word and all words
  - The weights are scaled and passed through a softmax to become a probability distribution on (0, 1)
- Encoder:
  - ~~Using global (whole-text) information, compute each embedding vector's (in NLP, each word's) relevance to the whole vocabulary, obtaining that embedding vector's attention scores over all words~~
- Decoder:
  - Takes the current embedding vector as Q and queries the attention the encoder has learned
  - For example, right after a subject, a verb is more likely to follow; once "rain" appears, the chance of "umbrella" appearing later in the text goes up
- At inference time, the decoder considers both (1) the current embedding vector (local information) and (2) the encoder output (global, whole-text information)
  - Embedding vectors that come later in the sequence are masked (see the code sketch right after this list)
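To make the pipeline above concrete, here is a minimal NumPy sketch of the attention-score computation just described (Q × Kᵀ similarity, scaling, softmax, weighting of V), with an optional decoder-style causal mask. The function and variable names are illustrative, not taken from the paper; in a real model Q, K, and V come from learned projections of the input.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V, causal=False):
    """Q, K, V: (seq_len, d_k) arrays. Returns (output, attention weights)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # similarity of every token with every token
    if causal:
        # decoder-style mask: a position may attend only to itself and earlier positions
        n = scores.shape[0]
        keep = np.tril(np.ones((n, n), dtype=bool))
        scores = np.where(keep, scores, -1e9)
    weights = softmax(scores, axis=-1)        # each row is a distribution over tokens
    return weights @ V, weights

# Toy usage: 4 tokens of dimension 8, using X itself as Q, K, and V.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(X, X, X, causal=True)
print(w.round(2))  # lower-triangular rows, each summing to 1
```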
#### Self-Attention

- ![](https://i.imgur.com/56SgiL2.png =200x)
- Scaled Dot-Product Attention
  - The dot product reflects the similarity between two vectors
  - The formula shown in the images is reconstructed in LaTeX below
  ![](https://i.imgur.com/i4yqJyW.png =300x)
  ![](https://i.imgur.com/XiVdAHa.png =200x)
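For readability, here is the scaled dot-product attention formula that the images above show, written out in LaTeX (Equation 1 of the paper):

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V$$

The division by $\sqrt{d_k}$ is the "scaling": it keeps the dot products from growing with the key dimension and pushing the softmax into regions with extremely small gradients.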
#### Multi-head Attention

- Built from multiple self-attention heads that run in parallel, with their outputs concatenated (a code sketch follows the figure)
- Gives the model flexible, multi-faceted attention
- ![](https://i.imgur.com/qGzjL4P.png)
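A minimal sketch of multi-head attention under the standard formulation: the input is projected into per-head Q, K, and V with learned matrices, every head attends in parallel, and the head outputs are concatenated and passed through a final projection. The weight names `W_q`, `W_k`, `W_v`, `W_o` are illustrative placeholders for the learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, W_q, W_k, W_v, W_o, num_heads):
    """X: (n, d_model). W_q, W_k, W_v, W_o: (d_model, d_model) learned matrices."""
    n, d_model = X.shape
    d_k = d_model // num_heads

    def split(M):  # (n, d_model) -> (num_heads, n, d_k)
        return M.reshape(n, num_heads, d_k).transpose(1, 0, 2)

    Q, K, V = split(X @ W_q), split(X @ W_k), split(X @ W_v)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)       # per-head attention scores
    heads = softmax(scores) @ V                            # (num_heads, n, d_k)
    concat = heads.transpose(1, 0, 2).reshape(n, d_model)  # concatenate the heads
    return concat @ W_o                                    # final output projection

rng = np.random.default_rng(0)
n, d_model, num_heads = 4, 16, 4
X = rng.normal(size=(n, d_model))
W_q, W_k, W_v, W_o = (rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4))
print(multi_head_attention(X, W_q, W_k, W_v, W_o, num_heads).shape)  # (4, 16)
```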
#### Position encoding

- CNNs and RNNs carry positional information, but the self-attention mechanism itself does not
- A positional encoding is therefore embedded onto each word (X) so that every token carries its position
  - This lets the model take position into account during training and inference
- Positional encoding (see the sketch below)
  - X + P, with P ∈ ℝ^n×d^:
  - $$ p_{i,2j} = \sin\left(\frac{i}{10000^{2j/d}}\right),\quad p_{i,2j+1} = \cos\left(\frac{i}{10000^{2j/d}}\right)$$
  - ![](https://i.imgur.com/sr9TaEn.png)
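A short sketch of the sinusoidal encoding defined above, assuming an even model dimension d; the function name `sinusoidal_positions` is illustrative. The matrix P is computed once and simply added to the token embeddings X.

```python
import numpy as np

def sinusoidal_positions(n, d):
    """P[i, 2j] = sin(i / 10000^(2j/d)), P[i, 2j+1] = cos(i / 10000^(2j/d))."""
    P = np.zeros((n, d))
    i = np.arange(n)[:, None]            # positions 0 .. n-1
    two_j = np.arange(0, d, 2)[None, :]  # the exponent's 2j: 0, 2, 4, ...
    angle = i / np.power(10000.0, two_j / d)
    P[:, 0::2] = np.sin(angle)           # even feature indices
    P[:, 1::2] = np.cos(angle)           # odd feature indices
    return P

# The encoding is added directly to the embeddings: X + P.
n, d = 6, 8
X = np.zeros((n, d))                     # stand-in for real token embeddings
X_pos = X + sinusoidal_positions(n, d)
print(X_pos[:2].round(3))
```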
### Recommended learning resources

- [Dive into Deep Learning (动手学深度学习) online course](https://courses.d2l.ai/zh-v2/)
- [Prof. Hung-yi Lee's 2021 Spring machine learning course](https://speech.ee.ntu.edu.tw/~hylee/ml/2021-spring.php)
- [Mu Li: a section-by-section close reading of the Transformer paper](https://www.youtube.com/watch?v=nzqlFIcCSWQ&list=PLFXJ6jwg0qW-7UM8iUTj3qKqdhbQULP5I&index=7)
- [The Illustrated Transformer](https://jalammar.github.io/illustrated-transformer/)
- [Attention mechanisms explained (part 2): Self-Attention and the Transformer](https://zhuanlan.zhihu.com/p/47282410)
- [One article is all you need on the Transformer (part 1): Self-attention](https://zhuanlan.zhihu.com/p/345680792)
- [Illustrated Self-Attention: an animated walkthrough of Q (Query), K (Key), and V (Value)](https://towardsdatascience.com/illustrated-self-attention-2d627e33b20a)