# [CS5871] Module 4 checks
Related links:
- [How to check assignment (by Celine)](https://github.com/srush/miniTorch/blob/1e86c50412c3af0c402baf679074d9df39ff1df8/ta_assignment_checklist.md)
- [Assignment](https://minitorch.github.io/module4/module4/)
---
### Note: line numbers below may differ slightly from srush/miniTorch
## 0. Setup
### setup.cfg:5 (optional)
```
[files] => [options]
minitorch
```
This change allows you to remove the `[]` from `setup.py`.
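For reference, a minimal sketch of the renamed section in standard setuptools declarative-config form (the `packages =` key is assumed from the usual layout):
```
# setup.cfg (sketch)
[options]
packages =
    minitorch
```
With the package list declared here, the explicit list in `setup.py` should no longer be needed.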
### Changes for removing Scalar following Module 2 FAQ
* Remove uses of Scalar/ScalarFunction, following [this post](https://edstem.org/us/courses/24271/discussion/1847044) (Details below).
Remove the following:
* from optim.py: `from .scalar import Scalar` and the lines marked below (a sketch of the resulting `step` follows this list):
```
if hasattr(p.value, "derivative"): # remove
if p.value.derivative is not None: # remove
p.update(Scalar(p.value.data - self.lr * p.value.derivative)) # remove
elif hasattr(p.value, "grad"): => if hasattr(p.value, "grad"):
```
https://github.com/Cornell-Tech-ML/mle-module-4-justinchiu/blob/master/minitorch/optim.py
* from \_\_init\_\_.py: `from .scalar import *`
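After those removals, the tensor-only update in `SGD.step` should look roughly like this (a sketch, assuming the standard miniTorch parameter interface with `p.value`, `p.update`, and a tensor `.grad`):
```
# minitorch/optim.py (sketch): step() with the Scalar branch removed
def step(self) -> None:
    for p in self.parameters:
        if p.value is None:
            continue
        if hasattr(p.value, "grad"):  # was `elif` before the Scalar branch was dropped
            if p.value.grad is not None:
                p.update(p.value - self.lr * p.value.grad)
```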
### 0D tensor indexing
* You can index into a 0D tensor and call `.item()`, but printing it fails. One-line fix:
```
# tensor_data.py:213
def index(self, index: Union[int, UserIndex]) -> int:
    if isinstance(index, int):
        aindex: Index = array([index])
    if isinstance(index, tuple):
        aindex = array(index)
    # pretend a 0-dim shape is a 1-dim singleton
    shape = self.shape
    if len(shape) == 0 and len(aindex) != 0:  # change this line
        shape = (1,)
```
https://github.com/Cornell-Tech-ML/mle-module-4-justinchiu/blob/master/minitorch/tensor_data.py#L213
## Task 4.4 max
`Max.apply` needs the `dim` argument wrapped as a tensor via `input._ensure_tensor(dim)`:
```
# minitorch/nn.py:91
def max(input: Tensor, dim: int) -> Tensor:
return Max.apply(input, input._ensure_tensor(dim))
```
A simpler lambda does not pass typechecking.
https://github.com/Cornell-Tech-ML/mle-module-4-justinchiu/blob/master/minitorch/nn.py#L90
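A quick usage sketch (values illustrative; assumes `minitorch.tensor` and the `max` defined above):
```
import minitorch
from minitorch.nn import max

t = minitorch.tensor([[1.0, 5.0], [3.0, 2.0]])
m = max(t, 1)  # dim passed as a plain int; wrapped internally by _ensure_tensor
# m holds the row maxima 5.0 and 3.0
```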
## Task 4.4 maxpool2d
With the previous change to `max`, pass it an int `dim` directly:
```
# minitorch/nn.py:151
return max(x, tensor([4]... => return max(x, 4).view(batch, ...
```
https://github.com/Cornell-Tech-ML/mle-module-4-justinchiu/blob/master/minitorch/nn.py#L151
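For context, the full `maxpool2d` then reads roughly as below (a sketch assuming the usual `tile` helper from Task 4.3, which reshapes to `batch x channel x new_height x new_width x (kh * kw)`):
```
# minitorch/nn.py (sketch)
def maxpool2d(input: Tensor, kernel: Tuple[int, int]) -> Tensor:
    batch, channel, height, width = input.shape
    x, new_height, new_width = tile(input, kernel)
    # reduce over the last (kernel) dimension, then drop it via view
    return max(x, 4).view(batch, channel, new_height, new_width)
```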
## Task 4.5 Mnist classification
Change the MNIST data directory string in `project/run_mnist_multiclass.py` to
```
# project/run_mnist_multiclass.py
mndata = MNIST("project/data/")
```
so that `python project/run_mnist_multiclass.py` works from the base directory.
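python-mnist resolves that path relative to the current working directory, which is why the repo-root-relative string matters; a sketch of the loading calls (standard python-mnist API):
```
# project/run_mnist_multiclass.py (sketch of the data-loading lines)
from mnist import MNIST

mndata = MNIST("project/data/")
images, labels = mndata.load_training()
```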
## Task 4.5 Mnist classification logging
The original logging function doesn't report the denominator for `correct`.
```
# project/run_mnist_multiclass.py:103
def default_log_fn(epoch, total_loss, correct, total, losses, model):
print(f"Epoch {epoch} loss {total_loss} valid acc {correct}/{total}")
# project/run_mnist_multiclass.py:180
# insert BATCH to give denominator for accuracy
log_fn(epoch, total_loss, correct, BATCH, losses, model)
```
https://github.com/Cornell-Tech-ML/mle-module-4-justinchiu/blob/master/project/run_mnist_multiclass.py
## Tests pass
https://github.com/Cornell-Tech-ML/mle-module-4-justinchiu/pull/1
# Streamlit
## No sentiment_interface
There is no `project/sentiment_interface` module. Remove:
```
# project/app.py:108,111
from sentiment_interface ...
PAGES["Module 4: Sentiment"] ...
```
https://github.com/Cornell-Tech-ML/mle-module-4-justinchiu/blob/master/project/app.py#L108
# Assignment writing
## Networks: No need to download mnist
Data is already in the `project/data` folder, and the python-mnist library is listed in `requirements.extra.txt`.
## Task 4.4 minitorch.dropout
"randoom"
## Task 4.5 Sentiment accuracy
I got 74% and 76% best validation accuracy, rather than 75%.
Change to: "We expect everyone to get > 70% best validation accuracy, with the expectation of around 75%."
## Task 4.5 training logs
Ask for:
* Sentiment: train loss, train accuracy, valid accuracy (> 70%, ~75%)
* Mnist: train loss, valid accuracy (out of 16)
* Training logs for both as txt files in the repo
# Guide: Convolution
* "which prevents us"... (white space)
* "concenptually"
* Empty code cell above "Similar gradient calculation" (missing A)
* Empty code cell above "Codewise (might be"
* Empty code cell below "Very roughly" and code seems outside of code cell
# Guide: Pooling
* "You will implement as version" => implement a version
# Guide: Multiclass
* sigmoid link broken
* "As we saw in Module 1," the sigmoid function ... "aply" => apply
* math for step(x) broken (standard definition below)
* math for ReLU broken (standard definition below)
* The plot for ReLU' shows ReLU instead
* Function Comparison section not rendering
* logsumexp link broken
* Binary Multiclass visualization not rendering
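For reference, the broken step/ReLU math should presumably render as the standard Module 1 definitions:
```
\text{step}(x) = \begin{cases} 1 & \text{if } x > 0 \\ 0 & \text{otherwise} \end{cases}
\qquad
\text{ReLU}(x) = \max(0, x)
\qquad
\text{ReLU}'(x) = \begin{cases} 1 & \text{if } x > 0 \\ 0 & \text{otherwise} \end{cases}
```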