AdamP: Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights (ICLR 2021)
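For context: AdamP's premise is that momentum-based updates steadily grow the norm of scale-invariant weights (those followed by normalization layers), which shrinks the effective step size and slows training. The remedy is to project out the radial component of the update so the weight norm stops inflating. A minimal sketch of that projection step, assuming a single weight tensor (illustrative only, not the official clovaai/adamp implementation, which also uses a cosine-similarity test to decide when to project):

```python
import torch

def project_radial(update, weight, eps=1e-8):
    # Remove the component of the update that is parallel to the weight
    # vector; the remaining tangential component changes the weight's
    # direction without growing its norm (a sketch of the AdamP idea).
    w = weight.reshape(-1)
    u = update.reshape(-1)
    radial = (torch.dot(u, w) / (torch.dot(w, w) + eps)) * w
    return (u - radial).view_as(update)
```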
PyTorch library for testing optimizers by visualizing how they descend on your own images. You can draw a custom loss landscape and see what different optimizers do.
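The general technique behind such a tool: evaluate a 2D loss surface on a grid, run an optimizer from a chosen start point, and overlay its trajectory on the contours. A minimal self-contained sketch with a toy Rosenbrock surface and plain gradient descent (illustrative only, not this library's API):

```python
import numpy as np
import matplotlib.pyplot as plt

def loss(x, y):
    # Rosenbrock function: a classic curved-valley test surface
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def grad(x, y):
    dx = -2 * (1 - x) - 400 * x * (y - x ** 2)
    dy = 200 * (y - x ** 2)
    return np.array([dx, dy])

# Run plain gradient descent and record the trajectory
p = np.array([-1.5, 2.0])
path = [p.copy()]
for _ in range(500):
    p -= 1e-3 * grad(*p)
    path.append(p.copy())
path = np.array(path)

# Contour plot with the descent path overlaid
xs, ys = np.meshgrid(np.linspace(-2, 2, 200), np.linspace(-1, 3, 200))
plt.contour(xs, ys, loss(xs, ys), levels=30)
plt.plot(path[:, 0], path[:, 1], "r.-", markersize=2)
plt.show()
```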
AeroAttention is a quantum-enhanced attention mechanism for transformer models. It reduces memory usage and accelerates computation, enabling scalable and efficient training of advanced neural network architectures.
Learning about the algorithms used in machine learning, along with techniques for training and testing models.
AixrOptima is an artificial intelligence startup that speeds up the optimization process by integrating quantum circuits.
Training a fully connected neural network (FCNN) with different backpropagation optimizers and comparing the number of epochs each takes to converge, along with their classification performance. Also building an autoencoder to obtain a hidden representation and using it for classification.
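A minimal sketch of that kind of comparison, using random toy data and a loss threshold as the convergence criterion (the dataset, architecture, and hyperparameters here are assumptions and may differ from the repository's setup):

```python
import torch
import torch.nn as nn

def epochs_to_converge(opt_name, threshold=0.1, max_epochs=200):
    # Train a small FCNN with full-batch updates and report how many
    # epochs it takes for the training loss to drop below `threshold`.
    torch.manual_seed(0)
    X, y = torch.randn(512, 20), torch.randint(0, 2, (512,))
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
    opt = {
        "sgd": torch.optim.SGD(model.parameters(), lr=0.1),
        "momentum": torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9),
        "adam": torch.optim.Adam(model.parameters(), lr=1e-3),
    }[opt_name]
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(1, max_epochs + 1):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
        if loss.item() < threshold:
            return epoch
    return max_epochs

for name in ("sgd", "momentum", "adam"):
    print(name, epochs_to_converge(name))
```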
Generating a TensorFlow model that predicts values along a sine wave
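A minimal sketch of such a model, assuming a small dense network fitted to samples of y = sin(x) with the Adam optimizer (the repository's exact architecture and training setup may differ):

```python
import numpy as np
import tensorflow as tf

# Samples of one period of a sine wave
x = np.linspace(0, 2 * np.pi, 1000, dtype=np.float32).reshape(-1, 1)
y = np.sin(x)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=100, verbose=0)

print(model.predict(np.array([[np.pi / 2]], dtype=np.float32)))  # ~1.0
```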
Deep learning on the COVID-19-Xray-Dataset using three models
Optimizers for TensorFlow and Keras.
The Bird Species Classifier is an application built around a Convolutional Neural Network (CNN) that classifies bird images into one of 525 species. Users upload an image of a bird and receive a prediction of the species; the project also analyses the performance of various optimization algorithms.
A NumPy-based neural network package implementation
Unofficial implementation of the Adan optimizer with Schedule-Free
In this repository you'll find examples of optimizers used in machine learning methods
Introduction to the Adam Optimizer with examples
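For reference, Adam keeps exponential moving averages of the gradient (first moment) and its elementwise square (second moment), corrects both for their zero initialization, and scales the step by the square root of the second moment. A minimal NumPy sketch of one update step, with a toy run on f(x) = x²:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Moving averages of the gradient and squared gradient
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction for the zero initialization of m and v
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

theta, m, v = np.array([5.0]), 0.0, 0.0
for t in range(1, 2001):
    g = 2 * theta                      # gradient of f(x) = x^2
    theta, m, v = adam_step(theta, g, m, v, t, lr=0.1)
print(theta)  # close to 0
```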
Experimented with the DFW neural network optimizer and compared it against SGD and Adam on both vision and language tasks