---
tags: Noah
---
:::info
Noah Nübling
Machine Learning with MATLAB
WS 2020/21
:::

# 07

## Question 1

The main advantage of an adaptive learning rate is that it relieves the machine learning model architect of the time-consuming task of choosing an appropriate learning rate by hand. In some instances, the time to converge can also be much reduced by using an adaptive learning rate. On the other hand, stochastic gradient descent with a fixed learning rate is easier to implement, and there are scenarios where it performs as well as, or better than, a more sophisticated algorithm with an adaptive learning rate. From what I could gather, there is no rigorous theory on when to choose which variant, so in practice, model architects might have to experiment to see which is appropriate for their model.

## Question 2

We could use a deep learning model containing cyclical kernels. These kernels would "wrap around" to the start of a dimension d of the data when part of them reaches past the end of that dimension, kind of like the snake in the classic Snake game on old Nokia phones. This would treat a one-dimensional data string like a "ring" by treating its first and last elements as if they were adjacent. Just as traditional square kernels make feature detection in images independent of the exact position of the feature in the image, cyclical kernels should make feature detection independent of a feature's position in the cycle.
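
To make the trade-off from Question 1 concrete, here is a minimal Python sketch (the course uses MATLAB, but the idea carries over) minimizing a toy objective `f(x) = x**2`. Plain gradient descent with a hand-picked fixed rate is compared against an AdaGrad-style update, which stands in here as one representative of the adaptive-rate family; the objective and all parameter values are illustrative assumptions.

```python
import math

def gradient(x):
    # Gradient of the toy objective f(x) = x**2.
    return 2.0 * x

def fixed_gd(x, lr=0.1, steps=50):
    # Plain gradient descent: the step size never changes,
    # so lr has to be chosen well by hand.
    for _ in range(steps):
        x -= lr * gradient(x)
    return x

def adagrad(x, lr=1.0, steps=50, eps=1e-8):
    # AdaGrad-style update: the effective learning rate shrinks
    # as squared gradients accumulate, so the initial lr needs
    # much less hand-tuning.
    g_sq = 0.0
    for _ in range(steps):
        g = gradient(x)
        g_sq += g * g
        x -= lr / (math.sqrt(g_sq) + eps) * g
    return x

# Both start at x = 5.0 and move toward the minimum at x = 0.
fixed_result = fixed_gd(5.0)
adaptive_result = adagrad(5.0)
```

With a well-chosen fixed rate the plain version converges quickly; the adaptive version also converges without that tuning effort, which is exactly the trade-off described above.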
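
The cyclical-kernel idea from Question 2 can be sketched in plain Python as a one-dimensional cross-correlation whose index wraps around with a modulo, so the kernel slides over the data as if it were a ring. The function name and the toy signal are my own illustrative choices, not part of any particular library.

```python
def circular_correlate(signal, kernel):
    # Slide `kernel` over `signal`, wrapping past the end back to the
    # start (a "cyclical kernel"). Every ring position gets an output,
    # so a feature straddling the end/start boundary is still detected.
    n = len(signal)
    return [
        sum(kernel[j] * signal[(i + j) % n] for j in range(len(kernel)))
        for i in range(n)
    ]

# Toy "feature": two adjacent 1s. On the ring, the last and first
# elements of this signal are neighbors, so the feature wraps around.
signal = [1, 0, 0, 0, 0, 1]
kernel = [1, 1]

responses = circular_correlate(signal, kernel)
# The strongest response is at index 5, where the kernel wraps from
# signal[5] back to signal[0] -- a plain (non-cyclical) kernel would
# miss this feature entirely.
```

This is the same effect as circular padding in convolutional layers: the detector's response no longer depends on where in the cycle the feature happens to sit.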