# ADL Lecture 3.2: Language Modeling Notes

###### tags: `NLP`

{%youtube [p2e_riORjuU](https://www.youtube.com/watch?v=LheoxKjeop8&list=PLOAQYZPRn2V5_9qzD7_1TzADthNSBf8_z&index=10&ab_channel=VivianNTUMiuLab) %}

## :memo: Language Modeling

- Goal: estimate the probability of each word appearing in a sentence.

### N-Gram Language Modeling

- Split a sentence into several windows and estimate the probability of each window.
  - Problem: if a word sequence never appears in the training data, its probability becomes 0.
  - Solution: assign unseen sequences a very small probability (smoothing). :rocket:

### Neural Language Modeling

- Goal: learn word vectors so that words with similar meanings are close together in vector space.

> Problem: the window size must be fixed, which does not match how real language works.

### Recurrent Neural Network

* Idea: take all preceding words into account, in temporal order.

> Every word that appeared earlier can be modeled.

---

## :memo: Recurrent Neural Network

## :memo: RNN Applications
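The n-gram idea and its zero-probability fix can be sketched in a few lines. This is a minimal bigram model with add-k (Laplace) smoothing; the toy corpus, `<s>` start token, and function names are illustrative assumptions, not from the lecture.

```python
# Bigram language model with add-k (Laplace) smoothing.
# Corpus and vocabulary are toy assumptions for illustration.
from collections import Counter

corpus = [["i", "love", "nlp"], ["i", "love", "deep", "learning"]]
vocab = {w for sent in corpus for w in sent}

unigram = Counter()
bigram = Counter()
for sent in corpus:
    tokens = ["<s>"] + sent      # hypothetical sentence-start marker
    unigram.update(tokens)
    bigram.update(zip(tokens, tokens[1:]))

def p_bigram(w, prev, k=1.0):
    # Add-k smoothing: unseen pairs get a small but nonzero probability,
    # instead of the hard 0 a raw count-based estimate would give.
    return (bigram[(prev, w)] + k) / (unigram[prev] + k * (len(vocab) + 1))

def sentence_prob(sent):
    # Chain rule over bigram windows: P(w1..wn) ≈ Π P(wi | wi-1)
    tokens = ["<s>"] + sent
    p = 1.0
    for prev, w in zip(tokens, tokens[1:]):
        p *= p_bigram(w, prev)
    return p
```

For example, `p_bigram("nlp", "deep")` was never observed in the corpus, yet smoothing keeps it strictly positive, so whole-sentence probabilities never collapse to zero.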
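The RNN idea above, a hidden state that summarizes all preceding words, can be sketched with a minimal Elman-style recurrence. The dimensions, random word vectors, and variable names are assumptions for illustration only.

```python
# Minimal Elman RNN forward pass in NumPy. The hidden state h at step t
# is a function of the input x_t AND the previous h, so it implicitly
# carries information about every earlier word, with no fixed window.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 4, 3                      # illustrative dimensions
W_xh = rng.normal(size=(d_h, d_in)) * 0.1   # input-to-hidden weights
W_hh = rng.normal(size=(d_h, d_h)) * 0.1    # hidden-to-hidden (recurrent) weights
b_h = np.zeros(d_h)

def rnn_forward(xs):
    h = np.zeros(d_h)                 # initial hidden state
    states = []
    for x in xs:                      # one step per word vector, in order
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return states

xs = [rng.normal(size=d_in) for _ in range(5)]  # 5 fake "word" vectors
states = rnn_forward(xs)
```

Unlike the fixed-window neural language model, the same weights are reused at every step, so the sequence can be arbitrarily long.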