# Optimizer Overview

- https://ruder.io/optimizing-gradient-descent/
- https://johnchenresearch.github.io/demon/

## Adam Implementations

- https://gluon.mxnet.io/chapter06_optimization/adam-scratch.html
- https://github.com/sagarvegad/Adam-optimizer/blob/master/Adam.py

## CUDA Python

- https://devblogs.nvidia.com/numba-python-cuda-acceleration/
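The from-scratch Adam implementations linked above all follow the same update rule from the original paper. A minimal pure-Python sketch of that rule for a single scalar parameter, assuming the paper's default hyperparameters (`lr=0.001`, `beta1=0.9`, `beta2=0.999`, `eps=1e-8`); the function and variable names here are illustrative, not taken from the linked code:

```python
import math

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter.

    m, v are the running first and second moment estimates; t is the
    1-based step count used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * grad       # EMA of gradients (first moment)
    v = beta2 * v + (1 - beta2) * grad ** 2  # EMA of squared gradients (second moment)
    m_hat = m / (1 - beta1 ** t)             # correct initialization bias toward zero
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(x) = x^2, whose gradient is 2x, starting from x = 1.0.
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 5001):
    x, m, v = adam_step(x, 2 * x, m, v, t)
print(x)
```

Because the update divides by `sqrt(v_hat)`, the effective step size is roughly `lr` regardless of gradient scale, which is why the iterate marches toward the minimum at a near-constant rate before settling into a small oscillation around it.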