### books
- Chip Huyen's [ML Interview Book](https://huyenchip.com/ml-interviews-book/)
- [ML Interview Questions](https://github.com/andrewekhalel/MLQuestions)
### ml
- Kilian Weinberger's [Cornell CS4780 class](https://www.youtube.com/playlist?list=PLl8OlHZGYOQ7bkVbuRthEsaLr7bONzbXS) and [notes](https://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/). Most of the interview topics in Machine Learning are covered in this series.
- [ML from Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html). To be read in parallel with CS4780.
### resources
- [ML Interview Resources](https://github.com/khangich/machine-learning-interview)
- [List of questions to revise](https://github.com/shafaypro/CrackingMachineLearningInterview)
- [More interview questions](https://github.com/aershov24/machine-learning-ds-interview-questions)
- [LinkedIn Skill Assessment](https://github.com/Ebazhanov/linkedin-skill-assessments-quizzes/blob/main/machine-learning/machine-learning-quiz.md) questions
#### topic-wise resources
- Raschka's [KNN notes](https://sebastianraschka.com/pdf/lecture-notes/stat479fs18/02_knn_notes.pdf), [video on heaps and priority queues](https://www.youtube.com/watch?v=HqPJF2L5h9U) (for time analysis; see the kNN sketch after this list)
- Scikit Learn's [clustering algorithms](https://scikit-learn.org/stable/modules/clustering.html)
- StatQuest's [XGBoost playlist](https://www.youtube.com/playlist?list=PLblh5JKOoLULU0irPgs1SnKO6wqVjKUsQ)
- Raschka's lectures on Boosting [1](https://www.youtube.com/watch?v=LxcGKNV5-p4&list=PLTKMiZHVd_2KyGirGEvKlniaWeLOHhUF3&index=43), [2](https://www.youtube.com/watch?v=zblsrxc7XpM&list=PLTKMiZHVd_2KyGirGEvKlniaWeLOHhUF3&index=43)
- [Blog on impurity measure in Decision Trees](https://ekamperi.github.io/machine%20learning/2021/04/13/gini-index-vs-entropy-decision-trees.html)
- Series of PCA blogs: [1](https://peterbloem.nl/blog/pca), [2](https://peterbloem.nl/blog/pca-2), [3](https://peterbloem.nl/blog/pca-3), [4](https://peterbloem.nl/blog/pca-4)
- [XGBoost vs LightGBM](https://bangdasun.github.io/2019/03/21/38-practical-comparison-xgboost-lightgbm/) blog
- Blogs on metrics used in ML: [1](https://shuzhanfan.github.io/2018/02/model-evaluation-metrics/), [2](https://kevalnagda.github.io/evaluation-metrics)
- [Linear Regression analysis](http://www.zstatistics.com/videos#/regression)
- [SVM math](https://shuzhanfan.github.io/2018/05/understanding-mathematics-behind-support-vector-machines/)
- [Expectation Maximization](https://ekamperi.github.io/mathematics/2021/07/03/expectation-maximization-part1.html)
- [Data imputation](https://fri-datascience.github.io/course_ids/handbook/missing-data.html)
- Google's crash course on [Recommendation Systems](https://developers.google.com/machine-learning/recommendation)
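To make the time-analysis point from the KNN item above concrete, here is a minimal sketch (not taken from any of the linked resources; the function name and shapes are my own) of a kNN query that keeps a size-k max-heap via `heapq`, so scanning n points costs O(n log k) instead of a full O(n log n) sort:

```python
import heapq
import numpy as np

def knn_indices(X_train, x_query, k=3):
    """Return indices of the k training points nearest to x_query."""
    heap = []  # stores (-distance, index); heap[0] is the farthest point kept so far
    for i, x in enumerate(X_train):
        d = np.linalg.norm(x - x_query)
        if len(heap) < k:
            heapq.heappush(heap, (-d, i))
        elif -heap[0][0] > d:  # new point is closer than the farthest kept point
            heapq.heapreplace(heap, (-d, i))
    # return the k survivors ordered from nearest to farthest
    return [i for _, i in sorted((-neg_d, i) for neg_d, i in heap)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    print(knn_indices(X, np.zeros(2), k=5))
```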
### dl (general + cnns)
- [CS231n Notes](https://cs231n.github.io/convolutional-networks/) on Convolutional Neural Networks. Most of the interview topics in CNNs are covered in this series.
- [UvA Notebooks](https://uvadlc-notebooks.readthedocs.io/en/latest/) and [Notebook Lectures](https://youtube.com/playlist?list=PLdlPlO1QhMiAkedeu0aJixfkknLRxk1nA&si=9jo8mXK5Re7M7jWy)
- [Sebastian Raschka's DL Playlist](https://youtube.com/playlist?list=PLTKMiZHVd_2KJtIXOW0zFhFfBaJJilH51): a great playlist for quickly reviewing DL concepts
#### topic-wise resources
- [Optimizers](https://ruder.io/optimizing-gradient-descent) (a small update-rule sketch follows this list)
- [Weight initialization techniques](https://pouannes.github.io/blog/initialization/)
- [RNN](https://karpathy.github.io/2015/05/21/rnn-effectiveness/) blog by Karpathy
- [LSTM](https://colah.github.io/posts/2015-08-Understanding-LSTMs/) blog by Olah
- [Autoencoders and VAEs](https://lilianweng.github.io/posts/2018-08-12-vae/) blog by Lilian Weng
- [Forward vs Reverse mode backprop](http://colah.github.io/posts/2015-08-Backprop/), [Backprop through conv layer](https://johnwlambert.github.io/conv-backprop/), Karpathy's [blog on understanding backprop](https://karpathy.medium.com/yes-you-should-understand-backprop-e2f06eab496b)
- [Theoretical motivations for DL](https://rinuboney.github.io/2015/10/18/theoretical-motivations-deep-learning.html)
- Chip Huyen's [System Design](https://huyenchip.com/machine-learning-systems-design/design-a-machine-learning-system.html#design-a-machine-learning-system-dwGQI5R)
- Karpathy's [Recipe](http://karpathy.github.io/2019/04/25/recipe/)
- Google's [Deep Learning Tuning Playbook](https://developers.google.com/machine-learning/guides/deep-learning-tuning-playbook)
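To complement the optimizer overview linked above, a minimal NumPy sketch of the SGD-with-momentum and Adam update rules (standard formulations as described in Ruder's post; the function names and the toy quadratic are my own, and one common momentum convention is used):

```python
import numpy as np

def sgd_momentum_step(w, grad, v, lr=1e-2, beta=0.9):
    """One SGD-with-momentum update: v <- beta*v + grad; w <- w - lr*v."""
    v = beta * v + grad
    return w - lr * v, v

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update with bias-corrected moment estimates (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# toy usage: minimize f(w) = ||w||^2, whose gradient is 2w
w = np.array([3.0, -2.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 501):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.1)
print(w)  # should approach the minimizer [0, 0]
```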
### projects/cv prep
- [Adversarial Machine Learning](https://adversarial-ml-tutorial.org/)
- [Annotated Papers](https://github.com/labmlai/annotated_deep_learning_paper_implementations)
- [NeRF Blog](https://dtransposed.github.io/blog/2022/08/06/NeRF/)
- [Frank Dellaert's NeRF Explosion](https://dellaert.github.io/NeRF/)
- Lilian Weng's Object Detection series: [1](https://lilianweng.github.io/posts/2017-10-29-object-recognition-part-1/), [2](https://lilianweng.github.io/posts/2017-12-15-object-recognition-part-2/), [3](https://lilianweng.github.io/posts/2017-12-31-object-recognition-part-3/), [4](https://lilianweng.github.io/posts/2018-12-27-object-recognition-part-4/)
- [GANs](https://lilianweng.github.io/posts/2017-08-20-gan/), [Open GAN problems](https://distill.pub/2019/gan-open-problems/)
- [Attention and Transformers](https://peterbloem.nl/blog/transformers), [Attention mechanisms](https://lilianweng.github.io/posts/2018-06-24-attention/), [Transformer family](https://lilianweng.github.io/posts/2020-04-07-the-transformer-family/)
### applied
- [NumPy broadcasting rules](https://numpy.org/doc/stable/user/basics.broadcasting.html), [UFuncs](https://jakevdp.github.io/PythonDataScienceHandbook/02.03-computation-on-arrays-ufuncs.html) (see the broadcasting example after this list)
- PyTorch [activation functions](https://uvadlc-notebooks.readthedocs.io/en/latest/tutorial_notebooks/tutorial3/Activation_Functions.html), [initialization techniques](https://pytorch.org/docs/stable/nn.init.html)
- [Tensor Puzzles](https://github.com/srush/Tensor-Puzzles)
- [WTFPython](https://github.com/satwikkansal/wtfpython)
- [From Python to NumPy](https://www.labri.fr/perso/nrougier/from-python-to-numpy/)
- Chip Huyen's [Python is Cool](https://github.com/chiphuyen/python-is-cool)
- [Autodiff Puzzles](https://github.com/srush/autodiff-puzzles)
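A tiny, self-contained illustration of the broadcasting rules referenced above (array names are arbitrary): shapes are aligned from the right, and each dimension pair must either match or contain a 1, which gets stretched.

```python
import numpy as np

col = np.arange(3).reshape(3, 1)   # shape (3, 1)
row = np.arange(4).reshape(1, 4)   # shape (1, 4)
table = col * row                  # shapes broadcast to (3, 4)
print(table.shape)                 # (3, 4)

# A common use: centering the columns of a data matrix without a loop.
X = np.random.default_rng(0).normal(size=(5, 3))   # (5, 3)
X_centered = X - X.mean(axis=0)                    # (5, 3) - (3,) -> (5, 3)
print(np.allclose(X_centered.mean(axis=0), 0))     # True

# Incompatible trailing dimensions raise a ValueError, e.g. (5, 3) + (4,).
```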
### rand
- [Matrix calculus](https://explained.ai/matrix-calculus/index.html) (a gradient-check sketch follows this list)
- [Cheatsheets for various courses](https://stanford.edu/~shervine/teaching/)
- [Interactive Linear Algebra book](https://textbooks.math.gatech.edu/ila/)
- [Visualizing optimization algorithms](http://louistiao.me/notes/visualizing-and-animating-optimization-algorithms-with-matplotlib/)
- [Research Advice](https://github.com/TheShadow29/research-advice-list)
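One way to practice the matrix-calculus material above: derive a gradient by hand and verify it with central finite differences. A minimal sketch for L(W) = ||XW - Y||^2 with hand-derived gradient 2 X^T (XW - Y); the function names and shapes are my own.

```python
import numpy as np

def loss(W, X, Y):
    """Least-squares loss L(W) = ||XW - Y||_F^2."""
    R = X @ W - Y
    return np.sum(R * R)

def analytic_grad(W, X, Y):
    """Hand-derived gradient dL/dW = 2 X^T (XW - Y)."""
    return 2 * X.T @ (X @ W - Y)

def numeric_grad(f, W, eps=1e-6):
    """Central finite differences, perturbing one entry of W at a time."""
    G = np.zeros_like(W)
    for idx in np.ndindex(*W.shape):
        Wp, Wm = W.copy(), W.copy()
        Wp[idx] += eps
        Wm[idx] -= eps
        G[idx] = (f(Wp) - f(Wm)) / (2 * eps)
    return G

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))
Y = rng.normal(size=(10, 2))
W = rng.normal(size=(4, 2))
print(np.allclose(analytic_grad(W, X, Y),
                  numeric_grad(lambda W_: loss(W_, X, Y), W),
                  atol=1e-4))   # True if the derivation is right
```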
### blogs
- Karpathy's [Blog](https://karpathy.github.io/)
- Chris Olah's [Blog](https://colah.github.io/)
- [distill.pub](https://distill.pub/)
- Stathis Kamperis' [Blog](https://ekamperi.github.io/archive.html)
- Lilian Weng's [Blog](https://lilianweng.github.io/posts/)
### hr
- [Questions to ask in interviews](https://jvns.ca/blog/2013/12/30/questions-im-asking-in-interviews/)
- [List of CS PhD advice](https://jedyang.com/post/list-of-awesome-cs-phd-application-advice/)
### additional dump, to read later
- [Interpretable Machine Learning](https://christophm.github.io/interpretable-ml-book/): useful for interview questions on how to interpret classical ML models