The AdEMAMix optimizer is a simple modification of the Adam optimizer that mixes two EMAs of the gradients to better take advantage of past gradients.
The paper includes optax skeleton code, which I could contribute if the maintainers deem this a good fit for the repo.
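For reference, the update rule combines a fast, Adam-style EMA with a much slower EMA that retains older gradients. Below is a minimal pure-JAX sketch of one step, following the paper's algorithm as I read it; the function name, state layout, and default hyperparameters (`b3`, `alpha`) are illustrative, not an existing optax API.

```python
import jax.numpy as jnp

def ademamix_step(params, grads, state, lr=1e-3, b1=0.9, b2=0.999,
                  b3=0.9999, alpha=5.0, eps=1e-8):
    """One AdEMAMix step (illustrative sketch, not the optax API).

    state = (m1, m2, nu, t): fast EMA, slow EMA, second moment, step count.
    """
    m1, m2, nu, t = state
    t = t + 1
    m1 = b1 * m1 + (1 - b1) * grads        # fast EMA, as in Adam
    m2 = b3 * m2 + (1 - b3) * grads        # slow EMA, remembers older gradients
    nu = b2 * nu + (1 - b2) * grads ** 2   # EMA of squared gradients
    m1_hat = m1 / (1 - b1 ** t)            # bias-correct the fast EMA only
    nu_hat = nu / (1 - b2 ** t)
    update = (m1_hat + alpha * m2) / (jnp.sqrt(nu_hat) + eps)
    return params - lr * update, (m1, m2, nu, t)

# Toy usage on a quadratic:
params = jnp.array([1.0, -2.0])
state = (jnp.zeros(2), jnp.zeros(2), jnp.zeros(2), 0)
grads = 2.0 * params                       # gradient of ||params||^2
params, state = ademamix_step(params, grads, state)
```

If I recall correctly, the paper also warms up `alpha` and `b3` over training to avoid instability early on; the sketch omits those schedules.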
Hey, any update on this since the PR was closed? :) Thanks!
Apologies. I closed this because it was just easier to "begin again". I have a draft PR locally that I was going to push this week.
Awesome, good to know! Thx for your work 👌
Okay, PR here.