---
tags: Personal meeting
title: NLP
date: '07, July, 2022'
---
## NLP recurring meeting
|Paper|Link|
|:-:|:-:|
|BERT|[BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/pdf/1810.04805.pdf)|
|Big Bird|[Big Bird: Transformers for Longer Sequences](https://arxiv.org/pdf/2007.14062.pdf)|
|RoBERTa|[RoBERTa: A Robustly Optimized BERT Pretraining Approach](https://arxiv.org/pdf/1907.11692.pdf)|
|LSTM|[Understanding LSTM Networks](https://colah.github.io/posts/2015-08-Understanding-LSTMs/)|
|Transformer|[Attention Is All You Need](https://arxiv.org/pdf/1706.03762.pdf)|
|Longformer|[Longformer: The Long-Document Transformer](https://arxiv.org/pdf/2004.05150.pdf)|
|Linformer|[Linformer: Self-Attention with Linear Complexity](https://arxiv.org/pdf/2006.04768.pdf)|
|Fastformer|[Fastformer: Additive Attention Can Be All You Need](https://arxiv.org/pdf/2108.09084.pdf)|
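
Most of the papers above build on, or modify, the scaled dot-product attention introduced in "Attention Is All You Need". As a quick reference, here is a minimal NumPy sketch of that mechanism; the function name, shapes, and toy inputs are illustrative assumptions, not taken from any of the papers' official code:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Illustrative shapes: Q is (seq_q, d_k), K is (seq_k, d_k),
    V is (seq_k, d_v); returns (seq_q, d_v).
    """
    d_k = Q.shape[-1]
    # Pairwise similarity scores, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax (subtract the max for numerical stability)
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the value vectors
    return weights @ V

# Toy usage: 4 query positions attending over 6 key/value positions
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(6, 8))
V = rng.normal(size=(6, 16))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 16)
```

The score matrix here is quadratic in sequence length, which is the bottleneck that Big Bird, Longformer, Linformer, and Fastformer each address with a different sparse or linear approximation.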