# Deep Learning Basics DL-1

- Perceptron and MLP:
  * https://www.youtube.com/watch?v=wl7gVvI-HuY&list=PLl8OlHZGYOQ7bkVbuRthEsaLr7bONzbXS&index=6
  * https://www.youtube.com/watch?v=kObhWlqIeD8&list=PLl8OlHZGYOQ7bkVbuRthEsaLr7bONzbXS&index=7
  * http://d2l.ai/chapter_multilayer-perceptrons/index.html
- Activation functions:
  * https://medium.com/@snaily16/what-why-and-which-activation-functions-b2bf748c0441
  * https://pytorch.org/docs/stable/nn.html#non-linear-activations-weighted-sum-nonlinearity
- Neural Networks and Gradient Descent Visualization:
  * https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi
  * https://www.youtube.com/watch?v=MfIjxPh6Pys&list=PLoROMvodv4rMiGQp3WXShtMGgzqpfVfbU&index=16
  * https://www.youtube.com/watch?v=g6InpdhUblE&list=PL5-TkQAfAZFbzxjBHtzdVCWE0Zbhomg7r&index=5
- Backpropagation:
  * https://www.youtube.com/watch?v=dB-u77Y5a6A&list=PL5-TkQAfAZFbzxjBHtzdVCWE0Zbhomg7r&index=6
  * https://www.youtube.com/watch?v=zUazLXZZA2U&list=PLoROMvodv4rMiGQp3WXShtMGgzqpfVfbU&index=17
- Optimization:
  * https://www.youtube.com/watch?v=YnQJTfbwBM8&list=PL5-TkQAfAZFbzxjBHtzdVCWE0Zbhomg7r&index=4
- Loss functions and a bit more optimization:
  * https://www.youtube.com/watch?v=h7iBpEHGVNc
  * https://pytorch.org/docs/stable/nn.html#loss-functions

As mentioned before, these concepts will be the foundation of your DL journey ahead, so they must be rock solid. The amount of material to cover is large this time: it is a patchwork of different courses, picking the best of each. Because of this patchwork there is a lot of overlap, so feel free to skip sections you find repetitive. Please try to follow the resources in the order they are presented. And please DO NOT spend a lot of time on the PyTorch links: just glance over them to get an idea of what kinds of activation and loss functions are out there; in my opinion, an hour is plenty (the two short sketches after this paragraph show where those pieces appear in actual PyTorch code). Please feel free to ask any and all doubts that arise while going through these resources.
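To make the perceptron/MLP and activation-function topics concrete, here is a minimal sketch of a two-layer MLP in PyTorch. It is not from any of the linked resources; the name `TinyMLP` and the layer sizes (4 inputs, 16 hidden units, 2 outputs) are illustrative assumptions for a toy setup.

```python
import torch
import torch.nn as nn

# Each nn.Linear is a layer of perceptron-style units (weighted sum + bias);
# the ReLU between them is the non-linear activation that makes the stack an
# MLP rather than a single linear map.
class TinyMLP(nn.Module):
    def __init__(self, in_features=4, hidden=16, out_features=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),  # try swapping in nn.Sigmoid() or nn.Tanh() here
            nn.Linear(hidden, out_features),
        )

    def forward(self, x):
        return self.net(x)

model = TinyMLP()
x = torch.randn(8, 4)   # a batch of 8 made-up inputs, 4 features each
print(model(x).shape)   # -> torch.Size([8, 2])
```

Swapping `nn.ReLU` for other activations from the PyTorch page linked above is a quick way to see what that catalog actually contains.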
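Gradient descent, backpropagation, loss functions, and optimization all meet in a standard training loop. Below is a minimal sketch of one, assuming random fake data, an arbitrary learning rate of 0.1, and 100 steps; none of these choices come from the linked resources.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A small MLP plus fake data, purely for illustration.
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 2))
x = torch.randn(64, 4)          # 64 fake samples, 4 features each
y = torch.randint(0, 2, (64,))  # 64 fake class labels (0 or 1)

loss_fn = nn.CrossEntropyLoss()  # one of many losses in torch.nn
opt = torch.optim.SGD(model.parameters(), lr=0.1)  # plain gradient descent

for step in range(100):
    opt.zero_grad()              # clear gradients from the previous step
    loss = loss_fn(model(x), y)  # forward pass, then measure the error
    loss.backward()              # backpropagation: compute d(loss)/d(param)
    opt.step()                   # gradient-descent update of the weights
    if step % 20 == 0:
        print(step, loss.item())  # the loss should trend downward
```

Once the videos above make sense, every line here should map onto a concept you have seen: the loss function, the backward pass, and the parameter update.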