---
title: "Attention module"
---

![](https://i.imgur.com/JHMoKF2.png)

### Tracking

1. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6412970/
2. https://papers.nips.cc/paper/2017/file/752d25a1f8dbfb2d656bac3094bfb81c-Paper.pdf

Both use the [Hierarchical Attention](https://www.cs.cmu.edu/~./hovy/papers/16HLT-hierarchical-attention-networks.pdf) approach (not read yet).

https://arxiv.org/pdf/1711.01124.pdf

### Others

1. https://arxiv.org/pdf/1705.02544.pdf
![](https://i.imgur.com/VG37zOL.png)
Pulls feature maps of different sizes out of VGG, deconvolves them to a common size, sums them, then applies a sigmoid to produce the attention weights.

2. https://openreview.net/pdf?id=HJlnC1rKPB
![](https://i.imgur.com/17XIZVR.png)
MHSA. So far it feels conceptually similar to SE, but I'm still working through it.

3. https://arxiv.org/pdf/1807.06514.pdf
![](https://i.imgur.com/Ux8o8Hz.png)
Another work from the CBAM team; it can be viewed as a parallel version of CBAM.
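Since item 2 is compared against the SE concept, here is a minimal NumPy sketch of SE-style (squeeze-and-excitation) channel attention for reference. All function names, shapes, and the reduction ratio are my own illustrative assumptions, not taken from any of the linked papers:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_channel_attention(x, w1, w2):
    """SE-style channel attention on a feature map x of shape (C, H, W).

    w1: (C//r, C) reduction weights; w2: (C, C//r) expansion weights.
    Hypothetical minimal sketch; in a real SE block w1/w2 are learned
    during training rather than fixed.
    """
    # Squeeze: global average pooling over spatial dims -> (C,)
    s = x.mean(axis=(1, 2))
    # Excitation: bottleneck MLP (ReLU then sigmoid) -> per-channel weights
    z = np.maximum(w1 @ s, 0.0)          # ReLU
    w = sigmoid(w2 @ z)                  # (C,), each weight in (0, 1)
    # Scale: reweight each channel of the input feature map
    return x * w[:, None, None]

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))
w1 = rng.standard_normal((2, 8))   # reduction ratio r = 4
w2 = rng.standard_normal((8, 2))
y = se_channel_attention(x, w1, w2)
print(y.shape)  # (8, 4, 4)
```

Because the sigmoid keeps every channel weight in (0, 1), the block can only attenuate channels relative to the input, never amplify them; the network learns which channels to suppress.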