
Optimizers in TensorFlow from scratch

Optimizers implemented in TensorFlow from scratch. See example.py and optimizers.py.

Optimizers

  • Stochastic gradient descent
  • Stochastic gradient descent with gradient clipping
  • Momentum
  • Nesterov momentum
  • Adagrad
  • Adadelta
  • RMSProp
  • Adam
  • Adamax
  • SMORMS3
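To illustrate what "from scratch" means here, below is a minimal sketch of two of the listed update rules (momentum SGD and Adam) written in plain NumPy rather than TensorFlow, so the arithmetic is explicit. The function names and signatures are illustrative only; they are not the API of optimizers.py.

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.05, mu=0.9):
    # Classic momentum: v <- mu*v - lr*g; w <- w + v
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

def adam_step(w, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: exponential moving averages of the gradient and its square,
    # with bias correction for the zero-initialized moments.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy problem: minimize f(w) = w^2, whose gradient is 2w.
w1, vel = 5.0, 0.0
w2, m, v = 5.0, 0.0, 0.0
for t in range(1, 201):
    w1, vel = sgd_momentum_step(w1, 2 * w1, vel)
    w2, m, v = adam_step(w2, 2 * w2, m, v, t)
```

Both iterates head toward the minimum at 0; Adam takes near-constant-size steps (on the order of lr) regardless of gradient magnitude, which is why it needs many iterations on this toy problem despite the large initial gradient.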

References

  • An overview of gradient descent optimization algorithms: http://sebastianruder.com/optimizing-gradient-descent/
  • Chainer Optimizers: http://docs.chainer.org/en/stable/reference/optimizers.html
  • SMORMS3: http://sifter.org/~simon/journal/20150420.html
  • AdaGrad, RMSProp, AdaDelta, Adam, SMORMS3 (in Japanese): http://qiita.com/skitaoka/items/e6afbe238cd69c899b2a