I have implemented some gradient descent algorithms for linear regression

beaupletga/Linear-Regression-Optimizers


Gradient Descent Optimizers

I have implemented some gradient descent optimizers for linear regression:

  • Vanilla gradient descent
  • Batch gradient descent (with momentum, Nesterov acceleration, ...)
  • Adagrad
  • RMSProp
  • Adam, ...
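
As a rough sketch of what these update rules look like (this is an illustration, not the repository's code: the data, function names, and hyperparameters are assumptions), here are vanilla gradient descent, momentum, and Adam applied to a least-squares fit of y = 2x + 1:

```python
import numpy as np

# Hypothetical toy problem: fit y = 2x + 1 by minimizing the MSE loss.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * X + 1.0

def gradient(w, b):
    """Gradient of the loss 0.5 * mean((w*x + b - y)^2) w.r.t. w and b."""
    err = w * X + b - y
    return np.mean(err * X), np.mean(err)

def vanilla(steps=500, lr=0.5):
    # Plain gradient descent: step against the full-batch gradient.
    w = b = 0.0
    for _ in range(steps):
        gw, gb = gradient(w, b)
        w -= lr * gw
        b -= lr * gb
    return w, b

def momentum(steps=500, lr=0.5, beta=0.9):
    # Momentum: accumulate a velocity term to damp oscillations.
    w = b = vw = vb = 0.0
    for _ in range(steps):
        gw, gb = gradient(w, b)
        vw = beta * vw + lr * gw
        vb = beta * vb + lr * gb
        w -= vw
        b -= vb
    return w, b

def adam(steps=3000, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected first/second moment estimates scale each step.
    w = b = mw = mb = sw = sb = 0.0
    for t in range(1, steps + 1):
        gw, gb = gradient(w, b)
        mw = b1 * mw + (1 - b1) * gw
        mb = b1 * mb + (1 - b1) * gb
        sw = b2 * sw + (1 - b2) * gw ** 2
        sb = b2 * sb + (1 - b2) * gb ** 2
        mhw = mw / (1 - b1 ** t)   # bias correction of the moments
        mhb = mb / (1 - b1 ** t)
        shw = sw / (1 - b2 ** t)
        shb = sb / (1 - b2 ** t)
        w -= lr * mhw / (np.sqrt(shw) + eps)
        b -= lr * mhb / (np.sqrt(shb) + eps)
    return w, b
```

On this convex toy problem all three variants should recover w ≈ 2, b ≈ 1; they differ mainly in how the step size adapts along the way.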

TODO:

  • Add a level-curve plot
  • KSGD
  • More generic implementation (polynomial regression)
  • Plot the evolution of the error on the fly
  • Newton's method (for logistic regression, but maybe in a new repo)
