# Attention Is All You Need

###### tags: `transformer` `self-attention`

## :cat: Paper Info
Conference : NIPS
Year : 2017
Paper : [PDF](https://arxiv.org/pdf/1706.03762.pdf)
Total Citation(Recent) :
Refs :

<br>

## :palm_tree: Abstract

<br>

## :fireworks: Method

### Relationship between Key, Query, and Value

![image](https://user-images.githubusercontent.com/38309191/121833214-d14e4e80-cd06-11eb-8f6a-03b35fff4023.png)

<br>

#### 1. Conventional attention

![image](https://user-images.githubusercontent.com/38309191/121833444-533e7780-cd07-11eb-9fc0-91875f46bb55.png)
![image](https://user-images.githubusercontent.com/38309191/121833456-5afe1c00-cd07-11eb-8e56-7cf9a9039609.png)

<br>

#### 2. Self-attention

![image](https://user-images.githubusercontent.com/38309191/121833859-42423600-cd08-11eb-87c4-bdf780487b97.png)
![image](https://user-images.githubusercontent.com/38309191/121833884-51c17f00-cd08-11eb-9f45-90b55c58751b.png)

<br>

### Positional Encoding

![image](https://user-images.githubusercontent.com/38309191/121833999-977e4780-cd08-11eb-9b6e-5c8cdfe14019.png)
![image](https://user-images.githubusercontent.com/38309191/121834014-a2d17300-cd08-11eb-9333-f410522e9f02.png)

<br>

### Multi-Head Attention

![image](https://user-images.githubusercontent.com/38309191/121833979-8cc3b280-cd08-11eb-9ad9-0228c8742b78.png)

<br>

## :bar_chart: Results

<br>

## :ledger: Memo
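The figures in the Method section define attention in terms of queries, keys, and values. As a working memo, here is a minimal NumPy sketch of the paper's scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. In self-attention, Q, K, and V are all projections of the same input sequence; the projection matrices below are random stand-ins for learned weights, purely for illustration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # (..., len_q, len_k)
    scores -= scores.max(axis=-1, keepdims=True)    # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                              # (..., len_q, d_v)

# Self-attention: Q, K, V all derive from the same sequence X
# via (here: random placeholder) projection matrices.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 64))  # 5 tokens, d_model = 64
W_q, W_k, W_v = (rng.normal(size=(64, 64)) for _ in range(3))
out = scaled_dot_product_attention(X @ W_q, X @ W_k, X @ W_v)
print(out.shape)  # (5, 64)
```

The 1/√d_k scaling keeps the dot products from growing with the dimension, which would otherwise push the softmax into regions with very small gradients.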
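Likewise, a sketch of the sinusoidal positional encoding from the paper, PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)); the result is added element-wise to the token embeddings so the model can make use of token order.

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Sinusoidal PE: even dims get sine, odd dims get cosine."""
    pos = np.arange(max_len)[:, None]      # (max_len, 1)
    i = np.arange(0, d_model, 2)[None, :]  # (1, d_model // 2)
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # PE(pos, 2i)
    pe[:, 1::2] = np.cos(angles)  # PE(pos, 2i+1)
    return pe

print(positional_encoding(max_len=50, d_model=64).shape)  # (50, 64)
```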
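Finally, multi-head attention: the paper runs h = 8 heads in parallel, each attending in a reduced subspace of dimension d_k = d_model / h, then concatenates the heads and applies an output projection W^O. This sketch reuses `scaled_dot_product_attention` from the memo above; the per-head projections are again random placeholders, not trained weights.

```python
import numpy as np

def multi_head_attention(X, num_heads, rng):
    """MultiHead(X) = Concat(head_1, ..., head_h) W_O, d_k = d_model / h."""
    seq_len, d_model = X.shape
    d_k = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        # Project X into this head's lower-dimensional Q/K/V subspace.
        W_q, W_k, W_v = (rng.normal(size=(d_model, d_k)) for _ in range(3))
        heads.append(scaled_dot_product_attention(X @ W_q, X @ W_k, X @ W_v))
    W_o = rng.normal(size=(d_model, d_model))
    return np.concatenate(heads, axis=-1) @ W_o  # (seq_len, d_model)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 64))  # 5 tokens, d_model = 64
print(multi_head_attention(X, num_heads=8, rng=rng).shape)  # (5, 64)
```

Because each head works in a d_model / h subspace, the total cost stays close to that of single-head attention with the full dimension.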