# CS410 Homework 7: Neural Networks
**Due Date: 10/30/2024**
**Need help?** Remember to check out [Edstem](https://edstem.org/us/courses/61309) and our website for TA assistance.
## Assignment Overview
Welcome to Homework 7! You'll start this assignment by building a logistic regression model, then learn how to calculate gradients automatically and explore neural networks using the `PyTorch` library, gaining hands-on experience with the deep learning pipeline. Here's what you'll learn and explore (a short autograd sketch follows the list):
- Fundamentals of logistic regression
- Backpropagation with PyTorch
- Building basic neural networks with PyTorch
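To preview what "calculating gradients automatically" means, here is a minimal, self-contained PyTorch autograd sketch. It is not part of the assignment stencil; the tensors and values are made up purely for illustration.

```python
import torch

# Parameters we want gradients for, plus one toy training example.
w = torch.tensor([1.0, -2.0], requires_grad=True)
x = torch.tensor([0.5, 3.0])
y = torch.tensor(1.0)

# Logistic-regression-style forward pass: sigmoid of a dot product.
p = torch.sigmoid(w @ x)

# Binary cross-entropy (log loss) for a single example.
loss = -(y * torch.log(p) + (1 - y) * torch.log(1 - p))

# Backpropagation: PyTorch fills in w.grad automatically.
loss.backward()
print(loss.item(), w.grad)
```

Calling `loss.backward()` is the piece PyTorch automates for you: instead of deriving gradient formulas by hand, the library traces the forward computation and backpropagates through it.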
## PyTorch
**Important**: For this assignment, you will need to install PyTorch, a deep learning library. We skipped PyTorch during the HW0 setup to keep the environment setup straightforward, since Torch can be a bit *finicky* to install.
If you run into issues installing Torch, you can run this notebook (and any future notebooks that use Torch) in Google Colab without any additional setup work.
To install Torch, activate your environment and run: `pip install torch==2.3.0`
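To check that the install worked, you can run a quick sanity check in Python (a minimal sketch; the exact version string may differ, what matters is that the import and the tensor operation succeed):

```python
import torch

# The import succeeds and a basic tensor operation runs on CPU.
print(torch.__version__)                      # e.g. "2.3.0"
print(torch.ones(2, 2) + torch.ones(2, 2))    # a 2x2 tensor of 2s
```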
## Downloads
Like Homework 4, this assignment will take place in a Python notebook file.
Please click [here](https://classroom.github.com/a/OmLWz9cq) to download the assignment code.
:::info
If you get a message "Running cells with 'cs410_env (Python 3.10.6)' requires the ipykernel package," please install the package.
:::
## Handin
Your handin should contain:
- all modified files, including comments describing the logic of your algorithmic modifications
- a README, containing a brief overview of your implementation
### Gradescope
Submit your assignment via Gradescope.
To submit through GitHub, run the following commands:
1. `git add -A`
2. `git commit -m "commit message"`
3. `git push`
Now, you are ready to upload your repo to Gradescope.
*Tip*: If you are having difficulties submitting through GitHub, you may instead zip up your homework folder and upload the zip.
## Rubric
| Component | Points | Notes |
|-------------------|------|--------------------------------|
| 1.1 Logistic Regression | 50 | Points awarded for correct implementation of `initialize_parameters`, `sigmoid`, `forward`, `compute_log_loss`, `backward_propagation`, and `optimize` (the conventional formulas are sketched below the rubric). |
| 1.2 Gradient Descent | 10 | Points awarded for correct implementation of gradient descent with logistic regression. |
| MyMLP README Questions | 20 | Points awarded for responding to the questions thoughtfully. |
| MyMLP Implementation | 20 | Points awarded for meeting the criteria; partial credit awarded for lower accuracies. |
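For reference, here are the conventional definitions behind the sigmoid and the log loss. This is a minimal NumPy sketch of the standard formulas, not the assignment stencil; your function names and signatures in the notebook may differ.

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic sigmoid: 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(y_true, y_pred, eps=1e-12):
    """Mean binary cross-entropy between labels in {0, 1} and predicted probabilities."""
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
```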
:::success
Congrats on submitting your homework; Steve is proud of you!!


:::