# nesterov-accelerated-sgd

Here are 20 public repositories matching this topic...

Optimization algorithms, including Momentum, AdaGrad, RMSProp, and Adam, implemented from scratch in Python using only NumPy. Also implements the Broyden-Fletcher-Goldfarb-Shanno (BFGS) optimizer and compares its results with those obtained using Adam.

  • Updated May 18, 2023
  • Jupyter Notebook
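
The repository above implements its optimizers from scratch using only NumPy; in the same spirit, a minimal sketch of the update rule behind this page's topic, Nesterov-accelerated SGD, could look like the following (function and variable names are illustrative and not taken from that repository):

```python
import numpy as np

def nesterov_sgd_step(params, velocity, grad_fn, lr=0.01, momentum=0.9):
    """One Nesterov-accelerated SGD update.

    The gradient is evaluated at the look-ahead point
    params + momentum * velocity rather than at params itself.
    """
    lookahead = params + momentum * velocity
    grad = grad_fn(lookahead)
    velocity = momentum * velocity - lr * grad
    return params + velocity, velocity

# Illustrative use: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([5.0, -3.0])
v = np.zeros_like(w)
for _ in range(100):
    w, v = nesterov_sgd_step(w, v, grad_fn=lambda x: 2 * x)
print(w)  # close to the minimizer [0, 0]
```

The only difference from classical momentum is where the gradient is taken: at the look-ahead point rather than at the current parameters.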

Digit recognition neural network using the MNIST dataset. Features include a full GUI, convolution, pooling, momentum, Nesterov momentum, RMSProp, batch normalization, and deep networks.

  • Updated Jun 8, 2020
  • C#

Repository with the submissions for the 'Fundamentals of Optimization' course, in which gradient descent and several of its variants are implemented: gradient descent with a fixed step size (alpha), Nesterov GD with a fixed step, GD with a decreasing step size, and GD with diagonal scaling and a fixed step size.

  • Updated Dec 19, 2023
  • Jupyter Notebook
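
As a rough illustration of the step-size variants named in the entry above, here is a small sketch, assuming the common decreasing schedule alpha / (k + 1) and a fixed diagonal scaling; none of the names or defaults below come from the course repository:

```python
import numpy as np

def gradient_descent(x0, grad_fn, alpha=0.1, n_steps=200, decreasing=False):
    """Plain gradient descent with either a fixed step size alpha
    or the decreasing schedule alpha / (k + 1)."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_steps):
        step = alpha / (k + 1) if decreasing else alpha
        x = x - step * grad_fn(x)
    return x

def diagonally_scaled_gd(x0, grad_fn, diag, alpha=0.1, n_steps=200):
    """Gradient descent with a fixed step and a fixed diagonal scaling,
    i.e. x <- x - alpha * D^{-1} grad with D given by its diagonal."""
    x = np.asarray(x0, dtype=float)
    d = np.asarray(diag, dtype=float)
    for _ in range(n_steps):
        x = x - alpha * grad_fn(x) / d
    return x

# Illustrative use on the ill-conditioned quadratic f(x) = 0.5 * (10 x_1^2 + x_2^2):
grad = lambda x: np.array([10.0 * x[0], 1.0 * x[1]])
print(gradient_descent([1.0, 1.0], grad))                        # fixed step
print(gradient_descent([1.0, 1.0], grad, decreasing=True))       # decreasing step
print(diagonally_scaled_gd([1.0, 1.0], grad, diag=[10.0, 1.0]))  # diagonal scaling
```

Diagonal scaling rescales each coordinate by a fixed factor, which is why it helps on problems whose curvature differs strongly across coordinates, as in the example above.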
