# 2020-08-11

## 1 [李宏毅課程](http://speech.ee.ntu.edu.tw/~tlkagk/courses_ML20.html) (Hung-yi Lee's ML course)

### ML intro
- Regression
- Classification
- http://cs231n.stanford.edu/slides/
- NN architecture & flow calculation
- Network Compression
    - BERT
    - ALBERT

### NLP & Human language
- Token definition
- Token Embedding
- Speech intro
    - [speech](http://ocw.aca.ntu.edu.tw/ntu-ocw/ocw/cou/104S204/7)
    - Speech2Vector
    - speech is characterized by which high/low frequencies appear (find which frequency bands are present)
- Voice Conversion
    - supervised
    - unsupervised
        - different people, different languages
        - one-shot learning
- seq2seq variants
- seq2seq tasks
    - sentiment classification
    - stance detection
        - post, replies => denied/accept
    - veracity prediction (fake articles)
        - post, replies, wiki => true/false
    - natural language inference
        - premise, hypothesis => contradiction / entailment / neutral
    - search engine
        - input: keyword, article; output: relevant or not
    - part-of-speech tagging
        - input length = output length
    - word segmentation
        - unique to Chinese: find the segmentation points between words
    - text style transfer
        - input length != output length
    - summarization
        - input length != output length
    - translation
        - input length != output length
    - unsupervised abstractive summarization
        - document -> model -> summary
        - GAN
    - unsupervised translation
    - Question Answering
        - reading comprehension
    - task-oriented dialogue
        - => natural language generation
        - => policy & state tracker
        - Natural Language Understanding
            - intent classification
            - slot filling
    - chatbot
        - chats with a human
        - input: multiple sequences => output: one sequence
        - can steer the direction of the reply (positive / negative)
    - coreference resolution
        - which entity each pronoun refers to
    - parsing
- self-supervised learning
    - difference between supervised and self-supervised learning
    - self-supervised learning model intro
        - ELMo
        - BERT
        - ERNIE
        - Grover
        - Big Bird (Transformer for longer sequences)
    - General Language Understanding Evaluation (GLUE)
        - tests a model's ability across different NLP tasks
        - extension: SuperGLUE
- BERT
    - Contextualized Word Embedding
    - BERT series
    - Word Embedding
        - w2v
        - GloVe
        - fastText
    - Contextualized Word Embedding
    - How to Train BERT
        - Masked LM
        - NSP
            - RoBERTa found NSP doesn't help; removing it works better
            - improvement: Sentence Order Prediction (SOP)
    - How to use BERT
        - sentiment analysis
        - NLI classification
        - Extraction QA
- MASS/BART
    - seq2seq models
    - handle input length != output length

## 2. torch
https://youtu.be/z0uOq2wEGc
- tensor calculation
- fc layer
- build model/module
- loss
- optimizer
- gradient descent
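The Masked LM objective listed under "How to Train BERT" can be sketched in a few lines of torch. This is a toy illustration, not BERT's real architecture: the vocabulary size, hidden size, and `mask_id` are made-up values, and a single embedding plus linear head stands in for the Transformer encoder. Only the masking-and-predict pattern matches the real objective.

```python
import torch

# Toy Masked LM sketch (assumed sizes; a real BERT uses a Transformer encoder).
vocab_size, hidden = 100, 32
mask_id = 0  # hypothetical [MASK] token id

embed = torch.nn.Embedding(vocab_size, hidden)
head = torch.nn.Linear(hidden, vocab_size)  # predicts the original token per position

tokens = torch.randint(1, vocab_size, (1, 10))  # a fake 10-token sentence
mask = torch.rand(tokens.shape) < 0.15          # mask ~15% of positions, as in BERT
inputs = tokens.masked_fill(mask, mask_id)      # replace masked tokens with [MASK]

logits = head(embed(inputs))                    # shape: (1, 10, vocab_size)
# The loss is computed only on the masked positions.
if mask.any():
    loss = torch.nn.functional.cross_entropy(logits[mask], tokens[mask])
else:
    loss = torch.tensor(0.0)
```

The key point the notes make about RoBERTa and SOP concerns the *second* pretraining objective (NSP); the masked-token objective above is kept in all of these variants.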
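The torch checklist in section 2 (tensor, fc layer, module, loss, optimizer, gradient descent) maps onto a minimal training loop. The data, layer sizes, and learning rate below are placeholder choices for illustration:

```python
import torch

# Minimal end-to-end flow: tensor -> fc layers -> module -> loss -> optimizer -> gradient step.
class TinyNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = torch.nn.Linear(4, 8)  # fully connected layers
        self.fc2 = torch.nn.Linear(8, 1)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

model = TinyNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)  # plain gradient descent
loss_fn = torch.nn.MSELoss()

x = torch.randn(16, 4)  # fake batch: 16 samples, 4 features
y = torch.randn(16, 1)

initial_loss = loss_fn(model(x), y).item()
for step in range(100):
    optimizer.zero_grad()        # clear old gradients
    loss = loss_fn(model(x), y)  # forward pass + loss
    loss.backward()              # backprop: compute gradients
    optimizer.step()             # gradient descent update
final_loss = loss_fn(model(x), y).item()
```

With a fixed batch and enough steps the loop simply overfits the fake data, so the loss should drop, which is a quick sanity check that gradients are flowing.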