Machine Learning Techniques Notes
===
{%hackmd Hyaw8Gm6n %}
- [Navigation Page](https://hackmd.io/@ShawnNTU-CS/r1kn_Nqhh)
:::warning
These notes are intended for review: they supplement some material, record key points the instructor covered verbally but that are not in the slides, and include my personal understanding of certain topics.
Before using them, you should first take Prof. Hsuan-Tien Lin's course:
- [Machine Learning Techniques](https://www.coursera.org/learn/machine-learning-techniques)
You can also consult these notes whenever something is unclear while studying, ideally alongside the lecture slides.
My perspective on some topics may help with understanding~
:::
Machine Learning Foundations Notes
---
- [Link](https://hackmd.io/@ShawnNTU-CS/BJVnW49nn)
16-Week Course
---
- [Week 1: Linear Support Vector Machine](https://hackmd.io/@ShawnNTU-CS/ByNeNuI2n)
- [Week 2: Dual Support Vector Machine](https://hackmd.io/@ShawnNTU-CS/HybC95v22)
- [Week 3: Kernel Support Vector Machine](https://hackmd.io/@ShawnNTU-CS/r1kk9fFn2)
- [Week 4: Soft-margin Support Vector Machine](https://hackmd.io/@ShawnNTU-CS/rJS8Wkchh)
- [Week 5: Kernel Logistic Regression](https://hackmd.io/@ShawnNTU-CS/S1T4Q9Thn)
- [Week 6: Support Vector Regression](https://hackmd.io/@ShawnNTU-CS/SkcCdKRnn)
- [Week 7: Blending and Bagging](https://hackmd.io/@ShawnNTU-CS/BywPOgyan)
- [Week 8: Adaptive Boosting](https://hackmd.io/@ShawnNTU-CS/SJvNkG16h)
- [Week 9: Decision Tree](https://hackmd.io/@ShawnNTU-CS/SkCMtCf62)
- [Week 10: Random Forest](https://hackmd.io/@ShawnNTU-CS/rkSqrG7an)
- [Week 11: Gradient Boosted Decision Tree](https://hackmd.io/@ShawnNTU-CS/H17m0L7a3)
- [Week 12: Neural Network](https://hackmd.io/@ShawnNTU-CS/BJuSNKEp2)
- [Week 13: Deep Learning](https://hackmd.io/@ShawnNTU-CS/HkIxZtv6h)
- [Week 14: Radial Basis Function Network](https://hackmd.io/@ShawnNTU-CS/S1hTRhOa2)
- [Week 15: Matrix Factorization](https://hackmd.io/@ShawnNTU-CS/H1JELhta3)
- [Week 16: Finale](https://hackmd.io/@ShawnNTU-CS/rJagqJ9an)
Gaps to Fill
---
- [Week 4: Soft Margin SVM](https://hackmd.io/oJS56aBkQZ-fR1dCWRK6Eg?both#alpha_ngt0--Support-Vector) How to compute $b$ when every support vector's $\alpha$ equals $C$.
- [Week 9: Decision Tree](https://hackmd.io/@ShawnNTU-CS/SkCMtCf62)
    - How to build a decision tree for regression
    - The precise meaning of handling missing features
- [Week 11: Gradient Boosted Decision Tree](https://hackmd.io/PO8x4FRTQmabv5oR-WYYNg?both#Gradient-Boosted-Decision-Tree-GBDT) Unsure how the sampling in GBDT should be done
- [Week 12: Neural Network](https://hackmd.io/@ShawnNTU-CS/BJuSNKEp2) Explanation of scaled L2
- [Week 16](https://hackmd.io/@ShawnNTU-CS/rJagqJ9an) A full summary once I have reviewed everything.
Homework Reflections / Records / Approaches
---
- [HW1](https://hackmd.io/@ShawnNTU-CS/rJGBgd93h)
- [HW2](https://hackmd.io/@ShawnNTU-CS/HknDRUGp2)
- [HW3](https://hackmd.io/@ShawnNTU-CS/H192_08ah)
- [HW4](https://hackmd.io/@ShawnNTU-CS/rkHWd_7h3)
Notes on Clever Tricks
---
- [Problem transformation](https://hackmd.io/iISbapICSbqs9aWnDVQpKQ?both#%E5%95%8F%E9%A1%8C%E7%9A%84%E8%BD%89%E6%8F%9B)
- [Lagrange Function](https://hackmd.io/I6TUxZjeSw270SMrwVDvUw?both#Lagrange-Function)
- [Transformation of max err](https://hackmd.io/@ShawnNTU-CS/SkcCdKRnn#Standard-Support-Vector-Regression-Primal)