# deepsense.ai interview - Michał Zobniów 2021

## The more asterisks (*), the less important the topic

### Pure ML

1. Supervised Learning vs Unsupervised Learning
2. K-nearest neighbors
3. Linear Regression
4. High Bias vs High Variance
5. Regularization
6. Ridge Regression
7. Quantile Regression***
8. Parameters vs Hyperparameters
9. Statistical inference***
10. Naive Bayes Classifier
11. Logistic Regression
12. Logistic Regression for multiclass classification
13. Huber and pseudo-Huber Loss*
14. A General and Adaptive Robust Loss Function***
15. Feature Selection
    * Scoring-based methods
    * Wrapper methods
    * LASSO
    * LARS**
16. Decision Trees
    * Purity criteria
    * Categorical vs numerical splits
    * Dealing with missing values
    * DT for regression
    * Pruning
17. Classifier Bagging
18. Random Forest
19. Boosting
20. AdaBoost
21. XGBoost
22. Kernels
23. SVM
24. Simplified PAC Theory****
25. K-means
26. Online K-means
27. Kohonen Maps****
28. Gaussian Mixture Models and EM****
29. Principal Component Analysis
30. Probabilistic Graphical Models****
31. Unbalanced data

### NN

1. Gradient descent
2. Batch, Mini-Batch & Stochastic Gradient Descent
3. Activation functions
4. SGD
5. R-Prop and RMSProp
6. ADAM
7. NN weight initialization
8. Early stopping
9. Polyak weight averaging*
10. Data augmentation
11. Norm constraints*
12. L2 (weight decay) regularization
13. Dropout
14. Label smoothing
15. BatchNorm
16. ConvNets (pooling, dilated convolutions**, padding, etc.)
17. ResNets (residual connections)
18. RNN
19. Vanishing and exploding gradients
20. LSTM

### NN Advanced

1. GoogLeNet**
2. EfficientNet
3. GRU
4. Attention
5. Transformers (BERT, ELMo, etc.)
6. Mask R-CNN
7. Object detection
8. Semantic segmentation
9. Instance segmentation
10. YOLO
11. GANs
12. UNets
13. Autoencoders
14. Variational Autoencoder**
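For the K-nearest neighbors entry in the Pure ML list, a minimal sketch can make the idea concrete: classify a point by majority vote among its `k` closest training points. This is an illustrative pure-Python implementation (the function name `knn_predict` and the toy data are my own, not from the list):

```python
from collections import Counter
import math

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest training points."""
    # Euclidean distance from x to every training point
    dists = [math.dist(p, x) for p in X_train]
    # Indices of the k smallest distances
    nearest = sorted(range(len(dists)), key=dists.__getitem__)[:k]
    # Majority vote over the labels of those neighbors
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy example: two clusters labeled 'a' and 'b'
X = [(0, 0), (0, 1), (5, 5), (6, 5)]
y = ['a', 'a', 'b', 'b']
label = knn_predict(X, y, (0.5, 0.5), k=3)  # nearest neighbors are mostly 'a'
```

In practice a library implementation (e.g. scikit-learn's `KNeighborsClassifier`) would be used; the sketch just shows the two moving parts an interviewer usually asks about: the distance metric and the vote.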
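For the gradient descent entries in the NN list, the core update rule is small enough to sketch directly: repeatedly step against the gradient. This is a hedged toy example (the helper `gradient_descent` and the quadratic objective are assumptions for illustration, not from the list):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Plain (full-batch) gradient descent: x <- x - lr * grad(x), repeated."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2; its gradient is 2 * (x - 3), minimum at x = 3
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Mini-batch and stochastic variants from the list differ only in *which* gradient is used each step: an average over a small batch, or a single sample's gradient, rather than the exact full-data gradient.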
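For the Attention entry in the NN Advanced list, scaled dot-product attention for a single query can be sketched in a few lines: score the query against every key, softmax the scores, and take the weighted sum of the values. A pure-Python sketch (function names and toy vectors are my own illustration):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    s = sum(exps)
    return [v / s for v in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector."""
    d = len(query)
    # Similarity of the query to each key, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    # Weighted sum of the value vectors
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]

# The query matches the first key more strongly, so the output leans toward values[0]
out = attention(query=[1.0, 0.0], keys=[[1.0, 0.0], [0.0, 1.0]], values=[[10.0, 0.0], [0.0, 10.0]])
```

Transformers stack this operation across many queries, heads, and layers, but the softmax-weighted average is the piece worth being able to write on a whiteboard.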