---
tags: Noah
---
:::info
Noah Nübling
Machine Learning with MATLAB
WS 2020/21
:::
# 01
## Problem 1
For OCR:
- Smaller file size
- Greater interoperability with other software
  - E.g. searching the text or copy-pasting it
For Image:
- OCR may not work (e.g. for bad handwriting)
- Some information might be lost (e.g. text color and highlighting)
- Sometimes an image is preferred by the receiver (maybe they want to draw on it)
## Problem 2
If a neural network assigns a very low probability to the category that a data point is labeled with, then it's quite likely that the label is wrong (assuming the NN is trained and reasonably accurate).
I'm not quite sure of the context in which the question is posed, so for clarity's sake, here is the context I used to answer it:
- We have a NN which tries to assign a label to a data point. Say the labels are "Elephant", "Giraffe", and "Rhino", and the data points are pictures.
- We train the NN for a while, until it can assign the categories to the pictures quite confidently.
- Now we give the NN a picture labeled "Elephant", but the NN assigns a probability of 0.0001% to the category "Elephant".
- This is a strong hint that the picture we gave the NN is not actually of an elephant and has been labeled wrongly. (A sketch of how to flag such points automatically follows below.)
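A minimal MATLAB sketch of this idea, with hypothetical variable names: `scores` holds the trained network's output, one row of class probabilities per data point, and `labelIdx(i)` is the column index of the category that point `i` is labeled with.

```matlab
% Flag data points whose given label receives almost no probability
% from the trained network (likely mislabeled).
m = size(scores, 1);
probOfLabel = scores(sub2ind(size(scores), (1:m)', labelIdx(:)));
suspect = find(probOfLabel < 1e-3);  % e.g. the 0.0001% case from above
disp(suspect)                        % indices of likely mislabeled data points
```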
## Problem 3
### Linear Regression loss function:
$L(w,b)$
$= \frac{1}{2m} \sum_{i=1}^{m}(h(x_i) - y_i)^2$
$= \frac{1}{2m} \sum_{i=1}^{m}(w\cdot x_i + b - y_i)^2$
#### Derivatives of the Linear Regression loss:
$\frac{\partial}{\partial w}L(w,b)$
$= \frac{\partial}{\partial w}\left(\frac{1}{2m} \sum_{i=1}^{m}(w\cdot x_i + b - y_i)^2\right)$
$= \frac{1}{2m} \sum_{i=1}^{m}2x_i(w\cdot x_i + b - y_i)$
$= \frac{1}{m} \sum_{i=1}^{m}x_i (w\cdot x_i + b - y_i)$
$= \frac{1}{m} \sum_{i=1}^{m}x_i (h(x_i) - y_i)$
$\frac{\partial}{\partial b}L(w,b)$
$= \frac{\partial}{\partial b}\left(\frac{1}{2m} \sum_{i=1}^{m}(w\cdot x_i + b - y_i)^2\right)$
$= \frac{1}{2m} \sum_{i=1}^{m}2(w\cdot x_i + b - y_i)$
$= \frac{1}{m} \sum_{i=1}^{m}(w\cdot x_i + b - y_i)$
$= \frac{1}{m} \sum_{i=1}^{m}(h(x_i) - y_i)$
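As a sanity check, here is a minimal MATLAB sketch of batch gradient descent that uses exactly these two gradients. The function name `linreg_gd` and the inputs `x`, `y` (vectors of the same orientation), `alpha` (learning rate), and `nIter` are assumptions for the example.

```matlab
function [w, b] = linreg_gd(x, y, alpha, nIter)
    % Batch gradient descent for h(x) = w*x + b with the squared-error loss.
    m = numel(y);
    w = 0; b = 0;
    for k = 1:nIter
        h  = w * x + b;                  % predictions h(x_i)
        dw = (1/m) * sum(x .* (h - y));  % dL/dw = 1/m * sum x_i (h(x_i) - y_i)
        db = (1/m) * sum(h - y);         % dL/db = 1/m * sum (h(x_i) - y_i)
        w  = w - alpha * dw;
        b  = b - alpha * db;
    end
end
```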
### Logistic Regression loss function:
$L(w,b)$
$= -\frac{1}{m} \sum_{i=1}^{m} \left(y_i \log(h(x_i)) + (1-y_i)\log(1-h(x_i))\right)$
where $h(x_i) = \sigma(w\cdot x_i + b) = \frac{1}{1+e^{-(w\cdot x_i + b)}}$ is now the sigmoid hypothesis.
#### Derivatives of the Logistic Regression loss:
$\frac{\partial}{\partial w}L(w,b)$
$= \frac{\partial}{\partial w}\left(-\frac{1}{m} \sum_{i=1}^{m} \left(y_i \log(h(x_i)) + (1-y_i)\log(1-h(x_i))\right)\right)$
$= \ldots$ (see the proofs below)
$= \frac{1}{m} \sum_{i=1}^{m}x_i (h(x_i) - y_i)$
$\frac{\partial}{\partial b}L(w,b)$
$= \frac{\partial}{\partial b}\left(-\frac{1}{m} \sum_{i=1}^{m} \left(y_i \log(h(x_i)) + (1-y_i)\log(1-h(x_i))\right)\right)$
$= \ldots$ (see the proofs below)
$= \frac{1}{m} \sum_{i=1}^{m}(h(x_i) - y_i)$
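Since the gradients have the same form as in the linear case (only $h$ changes to the sigmoid), the update loop is nearly identical. Again a minimal MATLAB sketch with assumed names (`logreg_gd`, `x`, `y` with $y_i \in \{0,1\}$, `alpha`, `nIter`):

```matlab
function [w, b] = logreg_gd(x, y, alpha, nIter)
    % Batch gradient descent for logistic regression with one feature.
    m = numel(y);
    w = 0; b = 0;
    for k = 1:nIter
        h  = 1 ./ (1 + exp(-(w * x + b)));  % sigmoid hypothesis h(x_i)
        dw = (1/m) * sum(x .* (h - y));     % same form as the linear case
        db = (1/m) * sum(h - y);
        w  = w - alpha * dw;
        b  = b - alpha * db;
    end
end
```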
##### Mathematical Proofs:
###### Derivatives of h(x)
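A sketch of the proof, writing $z = w\cdot x + b$ so that $h(x) = \sigma(z)$:
$\sigma(z) = \frac{1}{1+e^{-z}}$
$\sigma'(z) = \frac{e^{-z}}{(1+e^{-z})^2} = \frac{1}{1+e^{-z}} \cdot \frac{e^{-z}}{1+e^{-z}} = \sigma(z)(1-\sigma(z))$
With $\frac{\partial z}{\partial w} = x$ and $\frac{\partial z}{\partial b} = 1$, the chain rule gives:
$\frac{\partial}{\partial w}h(x) = x \cdot h(x)(1-h(x))$
$\frac{\partial}{\partial b}h(x) = h(x)(1-h(x))$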

###### Derivatives of L(w,b)
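A sketch filling in the $\ldots$ steps above, using the derivatives of $h(x)$ just derived. For a single summand $\ell_i = y_i \log(h(x_i)) + (1-y_i)\log(1-h(x_i))$:
$\frac{\partial \ell_i}{\partial w} = \left(\frac{y_i}{h(x_i)} - \frac{1-y_i}{1-h(x_i)}\right) \frac{\partial}{\partial w}h(x_i)$
$= \frac{y_i(1-h(x_i)) - (1-y_i)h(x_i)}{h(x_i)(1-h(x_i))} \cdot x_i\, h(x_i)(1-h(x_i))$
$= x_i\left(y_i - h(x_i)\right)$
Summing over $i$ and applying the leading $-\frac{1}{m}$ gives $\frac{\partial}{\partial w}L(w,b) = \frac{1}{m}\sum_{i=1}^{m}x_i(h(x_i) - y_i)$. The same computation without the factor $x_i$ gives $\frac{\partial}{\partial b}L(w,b) = \frac{1}{m}\sum_{i=1}^{m}(h(x_i) - y_i)$.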
