# Natural Language Processing

---

Alan Turing

![](https://hackmd.io/_uploads/HyOYhSjfT.png)

----

"I propose to consider the question, 'Can machines think?'"
— Alan Turing

----

![](https://hackmd.io/_uploads/rywR3rofp.jpg)

----

![](https://hackmd.io/_uploads/H1hAZHiGp.png)

---

## Pre-LLM

---

## SHRDLU

----

![](https://hackmd.io/_uploads/r128_Ocfp.png)

---

Siri

----

"I'm not sure about that."

----

# What Is a Large Language Model?

---

## How Large?

![](https://hackmd.io/_uploads/HJQ1SJGzp.png)

----

1. Capability: the more parameters, the more the model can represent
2. Memory: an LLM has no database!!

---

# 1-of-N Encoding

----

apple = 1, bird = 2, cat = 3

---

# Token

----

One word, many meanings

----

![](https://hackmd.io/_uploads/S1nkXu5za.png)

---

## Contextualized Word Embedding

----

Each token gets its own embedding;
the closer two tokens are in meaning, the more similar their embeddings

---

# ELMo

----

![](https://hackmd.io/_uploads/BkxVjuqzp.png)

----

Embeddings from Language Models

----

Uses a bidirectional RNN to take the surrounding context into account

----

我想過過過兒過過的生活
(a classic pun: the character 過 appears five times with different meanings)

----

過 != 過

---

## Transformer

----

![](https://hackmd.io/_uploads/HkASD1zzp.png)

----

Encoder-decoder

----

Given a sequence of tensors, the attention mechanism lets the elements of the sequence attend to one another

---

A few well-known LLMs

---

# BERT

Bidirectional Encoder Representations from Transformers

----

![](https://hackmd.io/_uploads/BkxcN8qGa.png)

----

Encoder-only

----

Excels at understanding context

----

# Bidirectional?

----

## Decoder: unidirectional

----

## Task #1: Masked LM (MLM)

----

Cloze test!

----

Yuan Lu is an [MASK] person.

----

![](https://hackmd.io/_uploads/ry4ZywcGa.png)

----

### Task #2: Next Sentence Prediction (NSP)

----

Take the [CLS] token from the model's final layer, pass it through a classification layer to produce a 2×1 vector, then apply Softmax to get the probability of IsNext

---

# GPT

----

Generative Pre-trained Transformer

----

Decoder-only

----

## Causal LM (masked self-attention)

----

One-directional next-token prediction (text continuation, like a word chain)

----

ChatGPT

----

To be continued
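---

The 1-of-N encoding above can be sketched in a few lines. This is a minimal sketch: the vocabulary and indices (apple = 1, bird = 2, cat = 3) are the deck's own example, while the `one_hot` helper name is mine.

```python
# Minimal sketch of 1-of-N (one-hot) encoding.
# Vocabulary indices follow the slide's example (starting at 1).
vocab = {"apple": 1, "bird": 2, "cat": 3}

def one_hot(word: str, vocab: dict) -> list:
    """Return a 1-of-N vector: all zeros except a 1 at the word's index."""
    vec = [0] * len(vocab)
    vec[vocab[word] - 1] = 1  # shift by 1 because the slide's indices start at 1
    return vec

print(one_hot("bird", vocab))  # -> [0, 1, 0]
```

Note how every pair of distinct words is equally far apart here; this is the limitation that contextualized embeddings address, where similar meanings get similar vectors.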