# RE Competition Preparation
###### tags: `RE`

## 1. KLUE

## 2. Training Pipeline
- Load the KLUE data and a pretrained model (see the loading sketch after this list)
- [Huffon's KLUE Transformers tutorial (GitHub)](https://github.com/Huffon/klue-transformers-tutorial)
- [KLUE Pretrained-Model](https://huggingface.co/klue)
- Implement the model head and the evaluation step, referring to the resources below
- [R-BERT pytorch implementation](https://github.com/monologg/R-BERT)
- [How to Train a Joint Entities and Relation Extraction Classifier using BERT Transformer with spaCy 3](https://towardsdatascience.com/how-to-train-a-joint-entities-and-relation-extraction-classifier-using-bert-transformer-with-spacy-49eb08d91b5c)
- [BERT RE github](https://github.com/plkmo/BERT-Relation-Extraction)
- [How to use BERT for RE](https://towardsdatascience.com/bert-s-for-relation-extraction-in-nlp-2c7c3ab487c4)
- Set up an environment for running experiments with Wandb (see the fine-tuning sketch after this list)
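
The loading step in the first item can be sketched with the `datasets` / `transformers` APIs. This is a minimal sketch, not the competition baseline: the Hub name `("klue", "re")`, the checkpoint `klue/bert-base`, the field names (`sentence`, `subject_entity`, `object_entity`, `label`), and the 30-class label count are assumptions to verify against the actual data.

```python
# Minimal sketch: load the KLUE RE data and a pretrained KLUE checkpoint.
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

raw = load_dataset("klue", "re")  # DatasetDict with "train" / "validation" splits (assumed)
tokenizer = AutoTokenizer.from_pretrained("klue/bert-base")
# KLUE RE is assumed here to have 30 relation classes; num_labels must match the label set.
model = AutoModelForSequenceClassification.from_pretrained("klue/bert-base", num_labels=30)

def preprocess(example):
    # Simplest baseline input: "<subject> [SEP] <object>" paired with the original sentence.
    entity_pair = example["subject_entity"]["word"] + " [SEP] " + example["object_entity"]["word"]
    return tokenizer(entity_pair, example["sentence"], truncation=True, max_length=256)

encoded = raw.map(preprocess)
print(encoded["train"][0].keys())
```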
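
For the model-head, evaluation, and Wandb items, `AutoModelForSequenceClassification` already puts a linear classifier on top of the encoder output, so a first baseline only needs a `Trainer` with `report_to="wandb"`. A minimal fine-tuning sketch, reusing `model`, `tokenizer`, and `encoded` from the block above; the project name, hyperparameters, and the accuracy metric are placeholders (KLUE RE is officially scored with micro F1 / AUPRC, so swap those in for leaderboard comparisons).

```python
# Minimal sketch: fine-tune the classifier head and log the run to Wandb.
import numpy as np
import wandb
from transformers import Trainer, TrainingArguments, DataCollatorWithPadding

wandb.init(project="klue-re", name="bert-base-baseline")  # project/run names are placeholders

def compute_metrics(eval_pred):
    # Plain accuracy as a stand-in; replace with the official KLUE RE metrics for real comparisons.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"accuracy": float((preds == labels).mean())}

args = TrainingArguments(
    output_dir="./outputs",
    num_train_epochs=3,
    per_device_train_batch_size=32,
    learning_rate=5e-5,
    evaluation_strategy="epoch",
    report_to="wandb",  # route Trainer logs to Wandb
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    data_collator=DataCollatorWithPadding(tokenizer),
    compute_metrics=compute_metrics,
)

trainer.train()
trainer.evaluate()
```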
## 3. SOTA Models
- BERT
- A typical Transformer-encoder algorithm, shown in Fig. 17.7, simply takes a pretrained encoder like BERT and adds a linear layer on top of the sentence representation (for example the BERT [CLS] token), a linear layer that is finetuned as a 1-of-N classifier to assign one of the 43 labels. The input to the BERT encoder is partially de-lexified; the subject and object entities are replaced in the input by their NER tags. (A small de-lexification sketch follows after this list.)
- RoBERTa and SpanBERT, both among the SOTA models, feed the input as a single sequence without a [SEP] token
- [RoBERTa](https://arxiv.org/pdf/1907.11692.pdf)
- Uses dynamic masking
- The masking pattern changes at every epoch
- Drops the NSP (Next Sentence Prediction) objective
- Their experiments show better downstream performance without NSP... still debated
- Trains on much, much more data
- Byte pair encoding
- Achieves SOTA with these four changes combined
- [SpanBERT](https://arxiv.org/pdf/1907.10529.pdf)
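
The partial de-lexification described in the BERT bullet above (subject and object mentions replaced by their NER types before encoding) can be illustrated with plain string slicing. A minimal sketch: the field names and the inclusive `end_idx` follow the same assumed KLUE RE schema as the loading sketch in section 2, the character offsets in the example are made up, and the `[SUBJ-TYPE]` / `[OBJ-TYPE]` marker format is just one common convention rather than the exact scheme used in the papers.

```python
# Minimal sketch: replace subject/object entity spans with typed markers (de-lexification).
def delexify(sentence: str, subj: dict, obj: dict) -> str:
    spans = sorted(
        [
            (subj["start_idx"], subj["end_idx"], f"[SUBJ-{subj['type']}]"),
            (obj["start_idx"], obj["end_idx"], f"[OBJ-{obj['type']}]"),
        ],
        reverse=True,  # replace right-to-left so earlier character offsets stay valid
    )
    for start, end, marker in spans:
        sentence = sentence[:start] + marker + sentence[end + 1:]  # end_idx assumed inclusive
    return sentence

# Hypothetical example (offsets and entity types made up for illustration):
sent = "이순신은 조선 중기의 무신이다."
subj = {"word": "이순신", "start_idx": 0, "end_idx": 2, "type": "PER"}
obj = {"word": "무신", "start_idx": 12, "end_idx": 13, "type": "POH"}
print(delexify(sent, subj, obj))  # "[SUBJ-PER]은 조선 중기의 [OBJ-POH]이다."
```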
## 4. Reference
- [Awesome relation extraction](https://github.com/roomylee/awesome-relation-extraction)
- [RE SOTA Models (Papers with Code)](https://paperswithcode.com/task/relation-extraction)
- [BERT Paper](https://arxiv.org/pdf/1810.04805.pdf)
- [ratsgo NLP Tutorial](https://ratsgo.github.io/nlpbook/docs/language_model/bert_gpt/)
- [KcELECTRA, a Korean pretrained language model](https://github.com/Beomi/KcELECTRA)
- [NLP Preprocessing Cheat Sheet](https://www.notion.so/b5685b85c4de4987926db999d0f5a8b6)
- [Hugging Face Transformers training tutorial](https://huggingface.co/transformers/training.html)
- [Overview of relation extraction and RE algorithms, from classic approaches to BERT (English, Stanford)](https://web.stanford.edu/~jurafsky/slp3/17.pdf)
- [KLUE RE tutorial found by Hoyoung](https://github.com/ainize-team/klue-re-workspace/blob/main/klue_re.ipynb)