# Problem 1 - Group B
## Presentation
## a)
- The main intuition is that a large margin puts every training point far from the decision boundary, so the classifier makes more **confident** predictions (a small numeric sketch follows this list).

- Another intuition comes from the **bias-variance** tradeoff: restricting attention to large-margin classifiers shrinks the effective hypothesis class, which reduces variance (at the cost of some bias).
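A minimal sketch of the confidence intuition (the weight vector and data points below are hypothetical, chosen only for illustration): for a linear classifier, a point with a larger geometric margin $y_i w^T x_i / \|w\|$ sits farther from the boundary, and the logistic map from score to probability assigns it a more confident prediction.

```python
import numpy as np

# Hypothetical weight vector and two points, for illustration only.
w = np.array([2.0, -1.0])
X = np.array([[3.0, 1.0],   # far from the decision boundary
              [0.6, 1.0]])  # close to the decision boundary
y = np.array([1.0, 1.0])

scores = X @ w                              # raw scores w^T x
margins = y * scores / np.linalg.norm(w)    # geometric margin = signed distance to boundary
confidence = 1.0 / (1.0 + np.exp(-scores))  # logistic map from score to P(y = +1)

for m, c in zip(margins, confidence):
    print(f"margin = {m:+.3f}  ->  P(y=+1) = {c:.3f}")
```

The far point gets a margin of about 2.24 and confidence near 0.99, while the near point gets a margin of about 0.09 and confidence barely above 0.5.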
## b)
- The bound should get tighter as the margin $\gamma$ increases (and as the number of data points $n$ grows); the sketch below makes this concrete.
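For concreteness, one common shape for such a margin bound (the exact constants and norms here are an assumption for illustration, not taken from the problem statement): with probability at least $1 - \delta$ over an i.i.d. sample of size $n$,

$$\Pr\big[y \, w^T x \le 0\big] \;\le\; \widehat{R}_\gamma(w) \;+\; O\!\left(\frac{BD}{\gamma \sqrt{n}}\right) \;+\; O\!\left(\sqrt{\frac{\log(1/\delta)}{n}}\right),$$

where $\widehat{R}_\gamma(w)$ is the fraction of training points with margin below $\gamma$, $\|w\|_2 \le B$, and $\|x_i\|_2 \le D$. Both a larger margin $\gamma$ and a larger sample size $n$ shrink the complexity term, consistent with the claim above.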
## c)
- The term $\frac{\gamma}{B}$ makes sense because the margin depends directly on how large the weights are allowed to be in $\min_{i} y_i w^T x_i$: if $\gamma > 0$, we can multiply $w$ by any $\alpha > 1$ and get an arbitrarily larger unnormalized margin without changing the classifier, so only the scale-invariant ratio $\gamma / B$ is meaningful (see the numeric check after this list).
- A similar scaling argument applies to the bound $D$ on the inputs: rescaling the $x_i$ rescales the margin by the same factor.
- With tighter bounds on the inputs and weights ($D$ and $B$), we get a tighter bound on the generalization error.
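A quick numeric check of the scaling argument (the data and weights below are hypothetical): multiplying $w$ by $\alpha > 1$ inflates the unnormalized margin $\min_i y_i w^T x_i$ by the same factor, while the normalized ratio $\gamma / \|w\|$ (the analogue of $\gamma / B$) stays fixed.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))   # hypothetical inputs
w = rng.normal(size=3)         # hypothetical weight vector
y = np.sign(X @ w)             # labels chosen so every point is correctly classified

for alpha in (1.0, 2.0, 10.0):
    gamma = np.min(y * (X @ (alpha * w)))      # unnormalized margin: grows with alpha
    ratio = gamma / np.linalg.norm(alpha * w)  # gamma / ||w||: invariant to alpha
    print(f"alpha = {alpha:5.1f}  gamma = {gamma:8.4f}  gamma/||w|| = {ratio:.4f}")
```

The printed `gamma` scales linearly with `alpha`, while `gamma/||w||` is identical on every line: only the normalized margin carries information about the classifier.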