---
title: "Attention module"
---

### Tracking

1. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6412970/
2. https://papers.nips.cc/paper/2017/file/752d25a1f8dbfb2d656bac3094bfb81c-Paper.pdf

Both use the [Hierarchical Attention](https://www.cs.cmu.edu/~./hovy/papers/16HLT-hierarchical-attention-networks.pdf) approach (not read yet).

https://arxiv.org/pdf/1711.01124.pdf

### Others

1. https://arxiv.org/pdf/1705.02544.pdf
   Pulls feature maps of different sizes out of VGG, deconvolves them to a common size, sums them, then applies a sigmoid to produce attention weights.
2. https://openreview.net/pdf?id=HJlnC1rKPB
   MHSA. So far it feels conceptually similar to SE, but I am still working through it.
3. https://arxiv.org/pdf/1807.06514.pdf
   Another work from the CBAM team; it can be viewed as a parallel version of CBAM (the two attention branches run side by side instead of sequentially).
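The parallel-branch idea in item 3 can be sketched as follows: a channel branch (global average pool plus a small MLP) and a spatial branch are computed side by side, summed by broadcasting, passed through a sigmoid, and applied residually. This is a minimal NumPy sketch, not the paper's implementation: the weights `w1`/`w2` are illustrative, and the spatial branch is simplified to a plain channel mean instead of the paper's dilated convolutions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bam_like_attention(x, w1, w2):
    """Parallel channel + spatial attention sketch.

    x:  feature map of shape (C, H, W)
    w1: MLP weight of shape (C // r, C) -- illustrative, r = reduction ratio
    w2: MLP weight of shape (C, C // r) -- illustrative
    """
    # Channel branch: global average pool -> 2-layer MLP -> (C,)
    c = x.mean(axis=(1, 2))            # (C,)
    c = np.maximum(w1 @ c, 0.0)        # ReLU, (C // r,)
    c = w2 @ c                         # (C,)
    # Spatial branch (simplified stand-in for dilated convs): (H, W)
    s = x.mean(axis=0)
    # Combine the two branches in parallel by broadcast sum, then sigmoid
    att = sigmoid(c[:, None, None] + s[None, :, :])   # (C, H, W)
    # Apply attention residually: x * (1 + att)
    return x * (1.0 + att)
```

Because the attention map is applied as `x * (1 + att)` with `att` in (0, 1), the module can only rescale each activation by a factor between 1 and 2, which keeps the identity path intact.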