# Machine Learning

## Lecture 1: Introduction of Deep Learning

- Sigmoid function:
  - A sigmoid can approximate a "hard sigmoid" (a flat–sloped–flat function)
  - Any piecewise linear function can be written as a bias b plus a sum of such flat–sloped–flat functions
  - Hence a NN with one sigmoid layer can approximate any piecewise linear function
- ML ≈ searching for a function:
  1. Define the function (model) and its parameters
  2. Define the loss
  3. Optimization
- Papers / links
  - [Prob Gen Model](https://www.youtube.com/watch?v=fZAZUYEeIMg&ab_channel=Hung-yiLee) / [Logistic Regression](https://www.youtube.com/watch?v=hSXFuypLukA&ab_channel=Hung-yiLee)
  - [Restricted Boltzmann machine](https://zh.wikipedia.org/wiki/%E5%8F%97%E9%99%90%E7%8E%BB%E5%B0%94%E5%85%B9%E6%9B%BC%E6%9C%BA)
  - [arXiv:1811.03378 — Activation Functions: Comparison of Trends in Practice and Research for Deep Learning](https://arxiv.org/pdf/1811.03378.pdf)

## Lecture 2: What to do if my network fails to train

- Training scenarios
  - Poor performance on the training set → diagnose by starting from a shallow model and gradually increasing model flexibility:
    - Model bias: the model is not powerful enough
    - Optimization: gradient descent has not yet found good enough parameters
  - Good training performance but poor test performance →
    - Overfitting: training performance improves while test performance degrades
      - The more powerful the model, the more easily it overfits
      - Collect more training data
      - Constrain the model / early stopping / regularization / dropout
    - Mismatch: the training and test data come from different distributions
- Optimization techniques
  - Telling saddle points apart from local minima
  - Updating parameters with batches
  - Momentum
  - Adaptive learning rate
- Mathematical analysis of classification problems

## Lecture 3: Image as input
## Lecture 4: Sequence as input
## Lecture 5: Sequence to sequence
## Lecture 6: Generation
## Lecture 7: Self-supervised Learning
## Lecture 8: Auto-encoder / Anomaly Detection
## Lecture 9: Explainable AI
## Lecture 10: Attack
## Lecture 11: Adaptation
## Lecture 12: Reinforcement Learning
## Lecture 13: Network Compression
## Lecture 14: Life-long Learning
## Lecture 15: Meta Learning
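The Lecture 1 claim that a piecewise linear function is a bias plus a sum of flat–sloped–flat ("hard sigmoid") pieces, each of which a sigmoid can approximate, can be checked numerically. This is a minimal sketch; the target function, knot positions, and coefficients are made up for illustration.

```python
import numpy as np

def hard_sigmoid(x, left, right):
    """Ramp that is 0 left of `left`, 1 right of `right`, linear in between."""
    return np.clip((x - left) / (right - left), 0.0, 1.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-2.0, 4.0, 601)

# Target: 0 for x < 0, rises to 2 on [0, 1], falls to 1 on [1, 2], flat after.
target = np.interp(x, [-2.0, 0.0, 1.0, 2.0, 4.0], [0.0, 0.0, 2.0, 1.0, 1.0])

# Exact decomposition: bias + weighted hard sigmoids, one per sloped segment.
exact = 0.0 + 2.0 * hard_sigmoid(x, 0.0, 1.0) - 1.0 * hard_sigmoid(x, 1.0, 2.0)

# Smooth version: replace each ramp by a sigmoid whose slope matches the ramp
# at its centre (weight = 4 / ramp_width) -- what a sigmoid layer can learn.
smooth = 2.0 * sigmoid(4.0 * (x - 0.5)) - 1.0 * sigmoid(4.0 * (x - 1.5))

exact_err = np.max(np.abs(exact - target))    # essentially zero
smooth_err = np.max(np.abs(smooth - target))  # small, shrinks with more units
```

Making the sigmoids steeper while adding more of them per segment drives the smooth error down, which is the intuition behind a one-hidden-layer sigmoid network approximating any piecewise linear function.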
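The three-step recipe from Lecture 1 (define a parameterized function, define a loss, then optimize) can be sketched end to end with plain gradient descent. The synthetic data, learning rate, and step count here are made-up illustration values.

```python
import numpy as np

# Made-up 1-D data generated from a hidden linear rule: y = 3x + 0.5 + noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 3.0 * x + 0.5 + rng.normal(scale=0.05, size=100)

# Step 1: define the function and its unknown parameters.
w, b = 0.0, 0.0

lr = 0.1
for _ in range(500):
    pred = w * x + b
    # Step 2: define the loss (mean squared error).
    loss = np.mean((pred - y) ** 2)
    # Step 3: optimization -- gradient descent on w and b.
    grad_w = np.mean(2.0 * (pred - y) * x)
    grad_b = np.mean(2.0 * (pred - y))
    w -= lr * grad_w
    b -= lr * grad_b
```

After training, `w` and `b` land close to the hidden values 3 and 0.5; swapping the model, loss, or optimizer changes only one of the three steps while the recipe stays the same.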
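Two of the optimization tricks listed under Lecture 2, momentum and an adaptive learning rate, amount to small changes to the update rule. This sketch applies a momentum update and an Adagrad-style adaptive update to the toy objective f(w) = w²; the hyperparameters are made-up illustration values.

```python
import numpy as np

def grad(w):
    # Gradient of the toy objective f(w) = w**2.
    return 2.0 * w

lr, beta, eps = 0.1, 0.9, 1e-8

w_m, v = 5.0, 0.0    # parameter and velocity for momentum
w_a, sq = 5.0, 0.0   # parameter and accumulated squared gradients for Adagrad

for _ in range(200):
    # Momentum: the step blends the previous movement with the new gradient,
    # so consistent gradients build up speed and can roll past small bumps.
    g = grad(w_m)
    v = beta * v - lr * g
    w_m += v

    # Adaptive learning rate (Adagrad-style): each step is divided by the
    # root of the accumulated squared gradients, shrinking steps where
    # gradients have been large.
    g = grad(w_a)
    sq += g * g
    w_a -= lr * g / (np.sqrt(sq) + eps)
```

On this convex toy problem both variants head toward the minimum at 0; their real payoff, as the lecture notes, shows up on harder loss surfaces with saddle points and badly scaled directions.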