# Optimizer Overview

- https://ruder.io/optimizing-gradient-descent/
- https://johnchenresearch.github.io/demon/

## Adam Implementations

- https://gluon.mxnet.io/chapter06_optimization/adam-scratch.html
- https://github.com/sagarvegad/Adam-optimizer/blob/master/Adam.py

## CUDA Python

- https://devblogs.nvidia.com/numba-python-cuda-acceleration/
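The Adam links above walk through implementing the optimizer from scratch. As a quick reference, a minimal NumPy sketch of a single Adam update (using the standard hyperparameters from the original paper; this is my own sketch, not code from the linked repositories) could look like:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient and its
    square, with bias correction for the zero-initialized moments."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias correction (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Usage: minimize f(x) = x^2 starting from x = 5.
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.1)
```

Note that the effective step size is roughly bounded by `lr`, since `m_hat / sqrt(v_hat)` is close to ±1 while gradients point consistently in one direction.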