# CS 444: Topics in Computer Science (Deep Learning)
---
## News
- Description of the [project proposal](https://hackmd.io/@sungjin/rJ9qf0V-8)
- Assignment 3 is posted (check out Canvas)
- Assignment 2 is posted (check out Canvas)
- ==[Assignment 1](https://hackmd.io/cb2hagPTTpCL8-LynmeE-w?view) is posted.==
- Sep 8 (Tuesday) class is cancelled due to the Labor Day holiday shift
- Aug 20: Syllabus Updated
## Canvas Site
- https://rutgers.instructure.com/courses/65637
---
## Overview
The principal purpose of this course is to introduce deep learning through a comparative presentation of methodology. The course focuses on the fundamentals of deep neural networks and their applications to natural language processing, computer vision, and deep generative modeling. It is intended for CS students with an applied mathematics orientation and knowledge of essential machine learning, as well as for students in other programs (computer and electrical engineering, statistics, mathematics, psychology) who are interested in the topic.
The course will cover traditional neural network architectures, including feed-forward, convolutional, and recurrent neural networks, as well as recent advances such as attention mechanisms, batch normalization, and the Transformer. An essential goal of the course is for students to learn to build their own DL toolbox using PyTorch. The course will also cover advanced topics such as generative models and reinforcement learning, time permitting.
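For a flavor of what building your own toolbox means in practice, here is a minimal sketch of a hand-rolled linear layer written as a PyTorch `nn.Module` (an illustrative example only; the actual assignments define their own interfaces).

```python
# A hand-rolled linear layer as a torch.nn.Module -- an illustrative sketch
# of the kind of building block students implement, not actual course code.
import torch
import torch.nn as nn

class MyLinear(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # nn.Parameter registers the tensors so autograd and optimizers see them
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # same computation as torch.nn.Linear: y = x W^T + b
        return x @ self.weight.T + self.bias

layer = MyLinear(4, 2)
out = layer(torch.randn(8, 4))  # a batch of 8 four-dimensional inputs
print(out.shape)                # torch.Size([8, 2])
```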
---
## Pre-Requisites
1. CS 440 or CS 439
1. I encourage students to take [CS 445: Introduction to Machine Learning](http://karlstratos.com/teaching/cs445fall20/cs445fall20.html) together with this course for a fuller treatment of modern machine learning.
1. Students must be familiar with **Python** programming. The homework will be based on [**PyTorch**](https://pytorch.org/), a Python framework for deep learning, so some experience with PyTorch will be helpful, although the course will provide a short introductory lecture on it. Students should also set up a PyTorch programming environment; a minimal sanity check is sketched after this list.
- [Getting Started With Pytorch In Google Collab With Free GPU](https://hackernoon.com/getting-started-with-pytorch-in-google-collab-with-free-gpu-61a5c70b86a)
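Before the first assignment, it is worth verifying the setup. The following is a minimal sanity check (an illustrative sketch, not part of the course materials) that confirms tensors, autograd, and GPU detection all work; on Google Colab, enable a GPU runtime first.

```python
# Minimal PyTorch sanity check -- illustrative only, not official course code.
import torch

print(torch.__version__)                   # installed PyTorch version

x = torch.randn(3, 4, requires_grad=True)  # random tensor tracked by autograd
y = (x ** 2).sum()                         # a simple scalar function of x
y.backward()                               # compute dy/dx
assert torch.allclose(x.grad, 2 * x)       # gradient of sum(x^2) is 2x

device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"Using device: {device}")           # "cuda" on Colab with a GPU runtime
```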
---
## Instructor & TA
- Instructor
- Sungjin Ahn (sungjin.ahn@cs.rutgers.edu) at CBIM-07
- Emails are usually answered within two business days.
- TA
- Yi-Fu Wu (yifuwu2@cs.rutgers.edu)
---
## Time and Location
- When: Tuesdays and Thursdays, 5:00-6:20 pm
- The course is remotely instructed via Zoom. The Zoom meeting link is posted on the course Canvas site.
---
## Office Hours
- Instructor
- Every Tuesday, 9-10 am, via Zoom. The Zoom link is posted on the course Canvas site.
- Since multiple students may be waiting during the office hour, appointments are required: please email the instructor in advance to schedule a slot within the office hour.
- TA
- Every Wednesday, 11 am - 12 pm
---
## Technology Requirements
- A computer, mic/audio system, and access to the Internet to participate in the lectures via Zoom
- Please visit the Rutgers Student Tech Guide page for resources available to all students. If you do not have the appropriate technology for financial reasons, please email the Dean of Students at deanofstudents@echo.rutgers.edu for assistance. If you are facing other financial hardships, please visit the Office of Financial Aid at https://financialaid.rutgers.edu/.
---
## Computing Resources
- Rutgers iLab Servers: https://report.cs.rutgers.edu/nagiosnotes/iLab-machines.html
- Google Colab: https://colab.research.google.com/
---
## Required Books and Materials
- There is no required book for this course. The following books may, however, be useful as references on relevant topics.
- Reference Books
1. Dive into Deep Learning (https://d2l.ai/)
1. Deep Learning (DL), Goodfellow, Ian and Bengio, Yoshua and Courville, Aaron, MIT Press, 2016, ISBN: 9780262035613
1. Pattern Recognition and Machine Learning (PRML), Christopher C. Bishop, Springer, 2006, ISBN: 9780387310732
1. Natural Language Processing with Distributed Representations (NLP), Kyunghyun Cho, https://arxiv.org/abs/1511.07916
- Pytorch Tutorial
- [Deep Learning with PyTorch: A 60 Minute Blitz](https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html)
- Reinforcement Learning
- [Reinforcement Learning (2nd)](http://incompleteideas.net/book/RLbook2020.pdf)
- [Lecture by David Silver](https://www.youtube.com/watch?v=2pWv7GOvuf0&list=PLqYmG7hTraZBiG_XpjnPrSNw-1XQaM_gB&index=2&t=0s)
---
## Course Structure and Requirements
#### Grading
- Attendance (15%)
- Assignment 1 (15%)
- Assignment 2 (15%)
- Assignment 3 (15%)
- Assignment 4 (15%)
- Final Project (25%)
#### Attendance policy
- One absence is allowed and will not affect your score. From the second absence on, a reasonable explanation with supporting evidence is required to avoid losing attendance points.
- Each absence without a reasonable explanation costs 3% of the score.
#### Check your [attendance sheet](https://docs.google.com/spreadsheets/d/1nmTOCTdemMRnGVm6dPJIm1pCWnyr7dbEBfDgkyiQcoA/edit?usp=sharing) here
#### Late submissions lose 2% per day
#### Final project
The final project is a team project of 1-3 students. It consists of (1) proposal writing, (2) a final presentation, and (3) a final report. More details will be explained in class; see also [final project](https://hackmd.io/@Tn97A1U0QG6gBtFPXRh4oQ/rJ9qf0V-8).
---
## Assignments
- [Assignment 1](https://hackmd.io/cb2hagPTTpCL8-LynmeE-w?view)
---
## Lecture Slides
- Available on Canvas
---
## Schedule
Week 1
- 9/01 - Course Overview, ML Foundation
- 9/03 - ML Foundation
Week 2
- 9/08 - **No Class** (Labor Day holiday shift)
- 9/10 - ML Foundation (HW1 Release)
Week 3
- 9/15 - ML Foundation & Linear Models
- 9/17 - Linear Models
Week 4
- 9/22 - Multi-Layer Perceptrons (MLP)
- 9/24 - MLP, Activation Functions, Backpropagation, Vanishing/Exploding Gradients
Week 5
- 9/29 - CNN
- 10/01 - CNN
Week 6
- 10/06 - Modern CNN
- 10/08 - Modern CNN
Week 7
- 10/13 - RNN
- 10/15 - RNN (Gradient Explosion/Vanishing, LSTM, GRU)
Week 8
- 10/20 - Attention, Transformer
- 10/22 - Attention, Transformer
Week 9
- 10/27 - BERT, GPT
- 10/29 - Optimization for Deep Learning (HW3)
Week 10
- 11/03 - Deep Reinforcement Learning (MDP, Dynamic Programming)
- 11/05 - Deep Reinforcement Learning (Dynamic Programming)
Week 11
- 11/10 - Deep Reinforcement Learning (Monte-Carlo, Temporal Difference, DQN)
- 11/12 - Deep Reinforcement Learning (Policy Gradient, MB-RL, HW4)
Week 12
- 11/17 - Deep Reinforcement Learning (Policy Gradient, MB-RL)
- 11/19 - Deep Generative Models - VAE
Week 13
- 11/24 - Deep Generative Models - VAE
- 11/26 - No Class (Thanksgiving Recess)
Week 14
- 12/01 - Deep Generative Models - VAE
- 12/03 - Deep Generative Models - GAN (End of Regular Class)
Week 15
- 12/08 - No Class
- 12/10 - No Class
Week 16
- 12/17 - Final Presentation
- 12/18 - Final Report Due