sargx digital garden


Gradient Descent

Feb 17, 2026 · 1 min read

https://jeremy9959.net/Math-3094-Spring-2021/published_notes/notes/GD.html

Gradient descent optimization algorithms:

  • Momentum
  • Nesterov accelerated gradient
  • Adagrad
  • Adadelta
  • RMSprop
  • Adam
  • AdaMax
  • Nadam
  • AMSGrad
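
Most of the variants above decorate the same basic update rule. As a quick reference (a minimal sketch, not taken from any of the linked posts), here is plain gradient descent with classical momentum on a 1-D quadratic; the function, learning rate, and step count are illustrative choices:

```python
# Minimal sketch: gradient descent with classical momentum
# on f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).

def grad(w):
    return 2.0 * (w - 3.0)

def momentum_gd(w0, lr=0.1, beta=0.9, steps=300):
    w, v = w0, 0.0
    for _ in range(steps):
        v = beta * v + lr * grad(w)  # velocity: decayed sum of past gradients
        w = w - v                    # step along the smoothed direction
    return w

print(momentum_gd(0.0))  # converges toward the minimum at w = 3
```

Setting `beta = 0` recovers vanilla gradient descent; Nesterov's variant instead evaluates the gradient at the look-ahead point `w - beta * v`.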

https://rasbt.github.io/mlxtend/user_guide/general_concepts/gradient-optimization/

https://ruder.io/optimizing-gradient-descent/

https://johnchenresearch.github.io/demon/
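
For contrast with plain momentum, a sketch of the Adam update on the same toy quadratic (my own illustration with arbitrary hyperparameters, not code from the posts above): Adam keeps running estimates of both the gradient mean and its uncentered variance, bias-corrects them, and scales each step by the inverse root of the variance estimate.

```python
import math

# Minimal sketch of Adam on f(w) = (w - 3)^2, gradient 2 * (w - 3).

def grad(w):
    return 2.0 * (w - 3.0)

def adam(w0, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8, steps=2000):
    w, m, v = w0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g       # first moment (mean) estimate
        v = beta2 * v + (1 - beta2) * g * g   # second moment estimate
        m_hat = m / (1 - beta1 ** t)          # bias correction for zero init
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

print(adam(0.0))  # lands near the minimum at w = 3
```

AMSGrad differs only in keeping the running maximum of `v_hat` in the denominator, which restores a convergence guarantee Adam lacks on some problems.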

