# Gradient Descent Optimizers

I have implemented several gradient descent optimizers for linear regression:

- Vanilla gradient descent
- Batch gradient descent (with momentum and Nesterov acceleration)
- Adagrad
- RMSprop
- Adam
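To illustrate the idea behind these optimizers, here is a minimal sketch of the vanilla variant: full-batch gradient descent minimizing the mean squared error of a linear model `y ≈ X @ w + b`. The function name, hyperparameters, and data are illustrative, not taken from this repository.

```python
import numpy as np

def vanilla_gd(X, y, lr=0.1, epochs=2000):
    """Fit y ~ X @ w + b by full-batch gradient descent on the MSE loss."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        err = X @ w + b - y            # residuals
        grad_w = 2.0 / n * (X.T @ err) # dMSE/dw
        grad_b = 2.0 / n * err.sum()   # dMSE/db
        w -= lr * grad_w               # step against the gradient
        b -= lr * grad_b
    return w, b

# Recover a known line y = 3x + 1 from noiseless data
X = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
y = 3.0 * X[:, 0] + 1.0
w, b = vanilla_gd(X, y)
```

The momentum, Adagrad, RMSprop, and Adam variants all keep this gradient computation and only change how the update step is formed from the (possibly accumulated) gradients.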

## TODO

- Add a level-curve plot of the loss
- KSGD
- More generic implementation (polynomial regression)
- Plot the evolution of the error on the fly
- Newton's method (for logistic regression, possibly in a separate repo)