---
title: "CBAM: Convolutional Block Attention Module"
---

# CBAM: Convolutional Block Attention Module (ECCV 2018)

This paper combines attention over both the spatial and channel domains.

### CBAM Block

### **Channel attention**

As in SE-Net, the globally pooled features are passed through a shared two-layer MLP; the outputs for the avg-pooled and max-pooled features are summed element-wise and passed through a sigmoid to obtain the channel scale:

$$M_c(F) = \sigma\big(\mathrm{MLP}(\mathrm{AvgPool}(F)) + \mathrm{MLP}(\mathrm{MaxPool}(F))\big)$$

The reduction ratio $r$ is likewise 16.

#### Channel-attention experiments

Experiments show that using Avg $+$ Max pooling together outperforms the single (avg) pooling proposed in SE.

### **Spatial attention**

Max-pool and avg-pool $F'$ along the channel axis only (axis=3) to obtain two 2-D maps, concatenate them, apply a $7 \times 7$ convolution, and finally a sigmoid to obtain the second scale:

$$M_s(F) = \sigma\big(f^{7 \times 7}([\mathrm{AvgPool}(F); \mathrm{MaxPool}(F)])\big)$$

#### Spatial-attention experiments

### Effect of attention-module order

### Classification results on ImageNet-1K

### Grad-CAM visualization
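The two attention steps above can be sketched in plain numpy. This is a minimal illustrative sketch, not the paper's implementation: the shared-MLP weights `W0`/`W1`, the conv `kernel`, and the naive loop-based 7×7 convolution are all assumptions made here for clarity (layout is `(C, H, W)` for a single image).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(F, W0, W1):
    """Channel attention: shared MLP on avg- and max-pooled features,
    element-wise sum, then sigmoid. F: (C, H, W); W0: (C/r, C); W1: (C, C/r)."""
    avg = F.mean(axis=(1, 2))                      # global avg pool -> (C,)
    mx = F.max(axis=(1, 2))                        # global max pool -> (C,)
    mlp = lambda v: W1 @ np.maximum(W0 @ v, 0.0)   # FC -> ReLU -> FC (shared)
    scale = sigmoid(mlp(avg) + mlp(mx))            # sum, then sigmoid -> (C,)
    return F * scale[:, None, None]

def spatial_attention(F, kernel):
    """Spatial attention: pool along the channel axis, concat the two 2-D maps,
    7x7 conv (naive loops, 'same' padding), then sigmoid. kernel: (2, 7, 7)."""
    avg = F.mean(axis=0)                           # (H, W)
    mx = F.max(axis=0)                             # (H, W)
    stacked = np.stack([avg, mx])                  # (2, H, W)
    k = kernel.shape[-1]
    pad = k // 2
    p = np.pad(stacked, ((0, 0), (pad, pad), (pad, pad)))
    H, W = avg.shape
    out = np.zeros((H, W))
    for i in range(H):                             # naive cross-correlation
        for j in range(W):
            out[i, j] = np.sum(p[:, i:i + k, j:j + k] * kernel)
    return F * sigmoid(out)[None, :, :]

def cbam(F, W0, W1, kernel):
    """CBAM block: channel attention first, then spatial attention."""
    return spatial_attention(channel_attention(F, W0, W1), kernel)
```

Because both attention maps lie in $(0, 1)$, the block preserves the feature-map shape and only rescales each element downward in magnitude.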