# SVM (Support Vector Machine)
A Training Algorithm for Optimal Margin Classifiers (1992) [[paper link]](http://www.svms.org/training/BOGV92.pdf)
Phoebe Huang
Pytorch Tainan
2019/7/20
---
## Task Description

Find a decision function $D(\mathbf{x})$ that classifies an input pattern $\mathbf{x}$:

$$
\mathbf{x} \in \text{class A} \ \text{if} \ D(\mathbf{x}) > 0 , \qquad \mathbf{x} \in \text{class B} \ \text{otherwise},
$$

given $p$ training examples $(\mathbf{x}_1, y_1), \ldots, (\mathbf{x}_p, y_p)$ with $y_k = 1$ for class A and $y_k = -1$ for class B.

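A minimal sketch of this decision rule in code (the linear $D$ and the numbers below are placeholder examples, not from the paper):

```python
import numpy as np

def classify(D, x):
    """Class A (+1) if D(x) > 0, else class B (-1)."""
    return 1 if D(x) > 0 else -1

# Hypothetical linear decision function, for illustration only.
w, b = np.array([1.0, -2.0]), 0.5
D = lambda x: w @ x + b
print(classify(D, np.array([3.0, 1.0])))  # D(x) = 1.5 > 0  ->  class A (+1)
```
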
----
SVM main idea: find the decision boundary that **maximizes the margin** $M$ to the training patterns.

----
The decision functions $D(\mathbf{x})$ must be linear in their parameters but are not restricted to linear dependence on $\mathbf{x}$.

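For example, with quadratic features $\varphi_i$, $D(\mathbf{x})$ is linear in $\mathbf{w}$ but quadratic in $\mathbf{x}$. A small sketch (this particular feature map is just an illustration):

```python
import numpy as np

def phi(x):
    """Quadratic feature map for a 2-D pattern; D stays linear in w."""
    x1, x2 = x
    return np.array([x1, x2, x1 * x1, x2 * x2, x1 * x2])

def D(w, b, x):
    return w @ phi(x) + b

w = np.array([0.0, 0.0, 1.0, 1.0, 0.0])   # D(x) = x1^2 + x2^2 + b: a circle
print(D(w, -1.0, np.array([1.0, 1.0])))   # 1 + 1 - 1 = 1.0
```
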
----
In the dual space the decision functions are of the form

$$
D(\mathbf{x}) = \sum_{k=1}^{p} \alpha_k K(\mathbf{x}_k, \mathbf{x}) + b . \tag{4}
$$

The function $K$ is a predefined kernel.

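A sketch of evaluating this dual form with two common kernel choices (the paper's experiments use polynomial kernels; the RBF kernel is another standard example):

```python
import numpy as np

def poly_kernel(x, xp, degree=2):
    return (x @ xp + 1.0) ** degree

def rbf_kernel(x, xp, gamma=1.0):
    return np.exp(-gamma * np.sum((x - xp) ** 2))

def D_dual(alphas, X_train, b, x, kernel=poly_kernel):
    """Dual-form decision function: sum_k alpha_k K(x_k, x) + b."""
    return sum(a * kernel(xk, x) for a, xk in zip(alphas, X_train)) + b
```
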
----
Provided that the expansion stated in equation 5,

$$
w_i = \sum_{k=1}^{p} \alpha_k \, \varphi_i(\mathbf{x}_k) , \tag{5}
$$

exists, equations 3 and 4 are **dual representations** of the same decision function, and

$$
K(\mathbf{x}, \mathbf{x}') = \sum_{i} \varphi_i(\mathbf{x}) \, \varphi_i(\mathbf{x}') . \tag{6}
$$

First, the margin between the class boundary and the training patterns is formulated in the **direct space**. This problem description is then transformed into the **dual space** by means of the **Lagrangian**.
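A quick numerical check of this equivalence for the degree-2 kernel $K(\mathbf{x}, \mathbf{x}') = (\mathbf{x} \cdot \mathbf{x}')^2$, whose explicit feature map is known in closed form (coefficients and data below are arbitrary):

```python
import numpy as np

def phi(x):
    """Explicit feature map for K(x, x') = (x . x')^2 in 2-D."""
    x1, x2 = x
    return np.array([x1 * x1, np.sqrt(2) * x1 * x2, x2 * x2])

K = lambda x, xp: (x @ xp) ** 2

X = np.array([[1.0, 2.0], [-1.0, 0.5]])
alphas, b = np.array([0.7, -0.3]), 0.1
x = np.array([0.4, -1.2])

w = sum(a * phi(xk) for a, xk in zip(alphas, X))           # expansion (5)
direct = w @ phi(x) + b                                    # equation (3)
dual = sum(a * K(xk, x) for a, xk in zip(alphas, X)) + b   # equation (4)
print(np.isclose(direct, dual))  # True: the two representations agree
```
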
---
## In the Direct Space
In the direct space, the decision function is

$$
D(\mathbf{x}) = \sum_{i=1}^{N} w_i \, \varphi_i(\mathbf{x}) + b , \tag{3}
$$

where the $\varphi_i$ are predefined functions of $\mathbf{x}$, and the $w_i$ and $b$ are the adjustable parameters. $D(\mathbf{x}) = 0$ defines a separating hyperplane.

----
The distance between this hyperplane and a pattern $\mathbf{x}$ is $\frac{D(\mathbf{x})}{\|\mathbf{w}\|}$.
All training data satisfy

$$
\frac{y_k D(\mathbf{x}_k)}{\|\mathbf{w}\|} \geq M , \qquad k = 1, \ldots, p ,
$$

where $M$ is the margin.

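A sketch of computing these signed distances and the resulting margin for a candidate hyperplane (data and weights are made-up):

```python
import numpy as np

# Toy separable data with labels y_k = +1 / -1 (made-up values).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, 0.0]])
y = np.array([1, 1, -1, -1])

w, b = np.array([1.0, 1.0]), 0.0        # a candidate hyperplane
D = X @ w + b                           # D(x_k) for every pattern
margins = y * D / np.linalg.norm(w)     # y_k D(x_k) / ||w||
print(margins.min())                    # the margin M of this hyperplane
```
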
----
The objective of the training algorithm is to find the parameter vector $\mathbf{w}$ that maximizes the margin $M$:

$$
M^{*} = \max_{\mathbf{w}, \; \|\mathbf{w}\| = 1} M \quad \text{subject to} \quad y_k D(\mathbf{x}_k) \geq M , \quad k = 1, \ldots, p .
$$

The bound $M^{*}$ is attained for those patterns satisfying

$$
\min_{k} \; y_k D^{*}(\mathbf{x}_k) = M^{*} .
$$

These patterns are called the **supporting patterns** (support vectors) of the decision boundary.

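Continuing the toy example, the supporting patterns are exactly those whose signed distance attains the minimum (still a sketch with made-up data):

```python
import numpy as np

X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, 0.0]])
y = np.array([1, 1, -1, -1])
w, b = np.array([1.0, 1.0]), 0.0

margins = y * (X @ w + b) / np.linalg.norm(w)
support = np.isclose(margins, margins.min())  # patterns attaining the bound
print(X[support])                             # [[-1. -1.] [-2.  0.]]
```
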
----

With the normalization $M \, \|\mathbf{w}\| = 1$, maximizing the margin is equivalent to

$$
\min_{\mathbf{w}, b} \; \frac{1}{2} \|\mathbf{w}\|^{2} \quad \text{subject to} \quad y_k D(\mathbf{x}_k) \geq 1 , \quad k = 1, \ldots, p .
$$

---
## In the Dual Space

----

Introducing one Lagrange multiplier $\alpha_k \geq 0$ per constraint gives the Lagrangian

$$
L(\mathbf{w}, b, \boldsymbol{\alpha}) = \frac{1}{2} \|\mathbf{w}\|^{2} - \sum_{k=1}^{p} \alpha_k \left[ y_k D(\mathbf{x}_k) - 1 \right] .
$$

----

Setting $\partial L / \partial \mathbf{w} = 0$ and $\partial L / \partial b = 0$ eliminates $\mathbf{w}$ and $b$ and yields the dual problem: maximize

$$
W(\boldsymbol{\alpha}) = \sum_{k=1}^{p} \alpha_k - \frac{1}{2} \sum_{k=1}^{p} \sum_{l=1}^{p} \alpha_k \alpha_l \, y_k y_l \, K(\mathbf{x}_k, \mathbf{x}_l)
$$

subject to $\alpha_k \geq 0$ and $\sum_{k=1}^{p} \alpha_k y_k = 0$.

----

The decision function becomes $D(\mathbf{x}) = \sum_{k} y_k \alpha_k K(\mathbf{x}_k, \mathbf{x}) + b$; only the supporting patterns have $\alpha_k > 0$, so all other training patterns drop out of the solution.

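The paper solves this quadratic program with a dedicated optimizer; as a rough sketch, a general-purpose solver also works on the toy data from earlier (SciPy's SLSQP with a linear kernel here; this is illustrative, not the paper's algorithm):

```python
import numpy as np
from scipy.optimize import minimize

X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, 0.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
H = np.outer(y, y) * (X @ X.T)             # y_k y_l K(x_k, x_l), linear kernel

def neg_dual(a):                           # minimize -W(alpha)
    return 0.5 * a @ H @ a - a.sum()

res = minimize(neg_dual, np.zeros(len(y)), method="SLSQP",
               bounds=[(0, None)] * len(y),
               constraints=[{"type": "eq", "fun": lambda a: a @ y}])
alpha = res.x
sv = alpha > 1e-6                          # supporting patterns
w = (alpha * y) @ X                        # w = sum_k alpha_k y_k x_k
b = np.mean(y[sv] - X[sv] @ w)             # from y_k D(x_k) = 1 on the SVs
print(alpha.round(3), w, b)
```
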
---
# 3 Properties of this Algorithm
## 3.1 Properties of the solution
The dual problem is a quadratic program whose matrix $y_k y_l K(\mathbf{x}_k, \mathbf{x}_l)$ is positive semidefinite, so the maximum of $W(\boldsymbol{\alpha})$ is a global optimum, and the solution depends only on the supporting patterns.
{"metaMigratedAt":"2023-06-14T22:43:09.132Z","metaMigratedFrom":"YAML","title":"SVM_slide_20190720","breaks":true,"contributors":"[{\"id\":\"e24670f1-5289-4250-9cc9-ef3c6e321508\",\"add\":2775,\"del\":299}]"}