Simple class-based Python implementation (via NumPy) of stochastic gradient methods: RAdam, Adam, AdaDelta, and SGD with momentum. Easy to plug into Stochastic Gradient Variational Inference routines.
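The repository's exact class interface isn't reproduced on this page, but a class-based NumPy optimizer of this kind can be sketched as follows. This is a hypothetical `Adam` class with a `step(params, grad)` method (names are assumptions, not the repository's API), shown driving a noisy quadratic objective — the same pattern used in Stochastic Gradient Variational Inference, where gradients are Monte Carlo estimates.

```python
import numpy as np

class Adam:
    """Minimal NumPy Adam optimizer (illustrative sketch only;
    the repository's actual class names and signatures may differ)."""

    def __init__(self, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = None  # exponential moving average of the gradient
        self.v = None  # exponential moving average of the squared gradient
        self.t = 0     # step counter, used for bias correction

    def step(self, params, grad):
        """Return updated parameters given a (possibly stochastic) gradient."""
        if self.m is None:
            self.m = np.zeros_like(params)
            self.v = np.zeros_like(params)
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad ** 2
        m_hat = self.m / (1 - self.beta1 ** self.t)  # bias-corrected mean
        v_hat = self.v / (1 - self.beta2 ** self.t)  # bias-corrected variance
        return params - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

# Usage: minimize f(x) = ||x||^2 from noisy gradient evaluations,
# mimicking the stochastic gradients that arise in SGVI.
rng = np.random.default_rng(0)
x = np.array([5.0, -3.0])
opt = Adam(lr=0.1)
for _ in range(2000):
    noisy_grad = 2 * x + rng.normal(scale=0.1, size=x.shape)
    x = opt.step(x, noisy_grad)
```

Keeping the optimizer state (`m`, `v`, `t`) inside the class is what makes this pattern easy to plug in: the training loop only ever calls `step`, so swapping Adam for SGD with momentum or AdaDelta means swapping the class, not the loop.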

robsalomone/simpleStochasticGradient

