# Pay Attention to MLPs

###### tags : `mlps` `self-attention` `transformer`

## :cat: Paper Info

Conference :
Year : 2021
Paper : [PDF](https://arxiv.org/pdf/2105.08050.pdf)
Total Citation(Recent) :
Refs :

<br>

## :palm_tree: Abstract

Task :

<br>

## :fireworks: Method

<br>

## :bar_chart: Results

<br>

## :ledger: Memo
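As a memo, a minimal NumPy sketch of the paper's gMLP block with its Spatial Gating Unit (SGU): channel expansion, GELU, then a spatial (token-mixing) linear gate, with a residual connection. Shapes, initializations, and the omission of the pre-block LayerNorm are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def spatial_gating_unit(x, W_s, b_s):
    """SGU: split channels in half, gate one half (u) with a spatial
    linear projection of the normalized other half (v)."""
    u, v = np.split(x, 2, axis=-1)                      # each (n, d_ffn/2)
    # LayerNorm over channels of v (affine params omitted for brevity)
    v = (v - v.mean(-1, keepdims=True)) / np.sqrt(v.var(-1, keepdims=True) + 1e-5)
    v = W_s @ v + b_s[:, None]                          # mix across the n tokens
    return u * v                                        # elementwise gating

def gelu(z):
    # tanh approximation of GELU
    return 0.5 * z * (1 + np.tanh(np.sqrt(2 / np.pi) * (z + 0.044715 * z**3)))

def gmlp_block(x, W1, W2, W_s, b_s):
    """One gMLP block: channel projection -> GELU -> SGU -> channel
    projection, plus a residual connection."""
    z = gelu(x @ W1)                                    # (n, d_ffn)
    z = spatial_gating_unit(z, W_s, b_s)                # (n, d_ffn/2)
    return x + z @ W2                                   # back to (n, d)

# Demo with assumed toy sizes: n tokens, d model dim, d_ffn expansion.
n, d, d_ffn = 8, 16, 32
rng = np.random.default_rng(0)
x = rng.standard_normal((n, d))
W1 = rng.standard_normal((d, d_ffn)) * 0.02
W2 = rng.standard_normal((d_ffn // 2, d)) * 0.02
W_s = np.zeros((n, n))        # paper: spatial weights initialized near zero
b_s = np.ones(n)              # paper: spatial bias initialized to one
y = gmlp_block(x, W1, W2, W_s, b_s)   # output keeps the input shape (n, d)
```

With the zero/one initialization above, the SGU starts out as a near-identity gate, so early in training the block behaves like a plain channel-wise MLP.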