0-1: Preface

Hi, I was a junior undergraduate at NTUEE when I started writing these notes. I hope they cover not only the applications but also the theoretical side of ML.

The following are my main reference textbooks:

  • M. Mohri, A. Rostamizadeh, and A. Talwalkar, “Foundations of Machine Learning,” The MIT Press, 2012.
  • S. Shalev-Shwartz and S. Ben-David, “Understanding Machine Learning: From Theory to Algorithms,” Cambridge University Press, 2014.
  • G. James, D. Witten, T. Hastie, and R. Tibshirani, “An Introduction to Statistical Learning: with Applications in R,” Springer, 2013.

ML lecture resources from NTU professors include:

  • Machine Learning (Hung-Yi Lee) (famous enough that you have probably heard of him)
  • Applied Deep Learning (Yun-Nung (Vivian) Chen)
  • Machine Learning Foundations/Techniques (Hsuan-Tien Lin)
  • Mathematical Principles of Machine Learning (I-Hsiang Wang)

From here on I'll write mostly in English (perhaps translating the occasional term into Chinese), so I ask for my readers' forbearance XD

Outline of these notes

Chapter 1 explores the theoretical foundation of ML, so-called statistical learning theory. We'll explain the PAC learning framework, see why computers can learn at all, and then present the learning guarantee for the case where the hypothesis set is finite. When the hypothesis set is infinite, we'll introduce VC theory and a more modern complexity measure, Rademacher complexity, which are powerful tools for establishing learning guarantees for infinite hypothesis sets.
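As a small preview of the kind of statement Chapter 1 proves (notation roughly follows Mohri et al.; the derivation via Hoeffding's inequality and a union bound is given in the chapter): for a finite hypothesis set $\mathcal{H}$ and an i.i.d. sample of size $m$, with probability at least $1-\delta$, every $h \in \mathcal{H}$ satisfies

$$
R(h) \;\le\; \widehat{R}(h) + \sqrt{\frac{\ln|\mathcal{H}| + \ln\frac{2}{\delta}}{2m}},
$$

where $R(h)$ is the true risk and $\widehat{R}(h)$ the empirical risk on the sample. VC dimension and Rademacher complexity will play the role of the $\ln|\mathcal{H}|$ term once $\mathcal{H}$ is infinite.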

tags: machine learning