---
tags: Plant Disease classification, ResNet
title: Personal meeting 2020/05/19
date: '19, May, 2020'
---

## Attention

The attention mechanism is inspired by the human visual system: instead of attending to everything in the visual field at once, we fixate on parts of it and combine the local information to understand the scene.

An attention step can be decomposed into three functions:

- score function: finds the regions that should be attended to
- alignment function: computes the attention weights
- generate context vector function: computes the output vector

### soft & hard attention

Both variants come from *Show, attend and tell: Neural image caption generation with visual attention* [1].

- encoder: VGG
- decoder: LSTM

#### hard attention

- $z_t$ is the context vector; $s_t$ is the index of the region the decoder's attention focuses on at time step $t$.
- My current understanding: keep the region with the largest weight as $s_t$ (weight 1) and discard the rest (weight 0).

#### soft attention

The weight $\alpha_{t,i}$ is the share that image region $a_i$ contributes to the information fed into the decoder at time $t$, so the context vector is the weighted average $z_t = \sum_i \alpha_{t,i} a_i$. A minimal code sketch contrasting the two variants is appended after the references.

## Conclusion

In NLP, the soft attention model is smooth and differentiable, so it can be trained end-to-end with backpropagation and is widely used for semantic understanding. I am still thinking about how the Split-Attention used by ResNeSt connects to this traditional notion of attention.

## References

[1] [Show, attend and tell: Neural image caption generation with visual attention](https://arxiv.org/pdf/1502.03044.pdf)
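
## Appendix: attention sketch

To make the score → alignment → context-vector pipeline concrete, below is a minimal NumPy sketch of one attention step over image annotation vectors. The bilinear score is a simplifying assumption (the paper uses a small MLP $f_{att}$), hard attention is shown as an argmax rather than the paper's multinoulli sampling, and the names `attention`, `W`, and the tensor shapes are hypothetical.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention(annotations, h_prev, W, mode="soft"):
    """One attention step over annotation vectors a_1..a_L.

    annotations: (L, D) array, one feature vector per image region.
    h_prev:      (H,) previous decoder (LSTM) hidden state.
    W:           (H, D) hypothetical score weights; the paper uses an
                 MLP f_att, a bilinear form is used here for brevity.
    """
    # score function: relevance of each region a_i to the decoder state
    scores = annotations @ (W.T @ h_prev)      # (L,)
    # alignment function: attention weights alpha_{t,i}, summing to 1
    alpha = softmax(scores)                    # (L,)
    if mode == "soft":
        # soft attention: context z_t is the expectation over regions
        z = alpha @ annotations                # (D,)
    else:
        # hard attention: keep only the most-attended region s_t
        # (the paper samples s_t ~ Multinoulli(alpha); argmax shown)
        z = annotations[np.argmax(alpha)]      # (D,)
    return z, alpha
```

A quick smoke test with the paper's VGG setting (a 14x14 grid of 512-dim conv features, so L = 196, D = 512; the hidden size H = 256 is an assumption):

```python
rng = np.random.default_rng(0)
a = rng.standard_normal((196, 512))        # annotation vectors
h = rng.standard_normal(256)               # decoder hidden state
W = 0.01 * rng.standard_normal((256, 512))
z_soft, alpha = attention(a, h, W, mode="soft")
z_hard, _ = attention(a, h, W, mode="hard")
assert np.isclose(alpha.sum(), 1.0)
```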