Add ADAM optimizer #57
Conversation
Codecov Report

Additional details and impacted files

@@            Coverage Diff             @@
##             main      #57      +/-   ##
==========================================
+ Coverage   88.95%   89.12%   +0.16%
==========================================
  Files          47       49       +2
  Lines        1902     1950      +48
==========================================
+ Hits         1692     1738      +46
- Misses        210      212       +2
==========================================
Wow, that was quick. Does Adam always produce a descent direction?
Yes, but the descent direction may not be in the direction of the gradient, so the descent strategy would then change. I don't know if this strategy is good; I just wanted to put it here for others to experiment with.
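For context, here is a minimal sketch of the standard Adam update (the textbook Kingma–Ba form, not necessarily this PR's exact implementation; the function name `adam_step` and its signature are illustrative). It shows why the step is not always a descent direction: once the first-moment estimate has accumulated momentum, the step can oppose the current gradient.

```python
import numpy as np

def adam_step(grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; the parameter update is x <- x - step."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad**2       # second-moment estimate
    m_hat = m / (1 - beta1**t)                  # bias correction
    v_hat = v / (1 - beta2**t)
    step = lr * m_hat / (np.sqrt(v_hat) + eps)
    return step, m, v

m = v = np.zeros(2)
g1 = np.array([1.0, 0.0])
step, m, v = adam_step(g1, m, v, t=1)           # fresh state: step is parallel to g1

g2 = np.array([-0.5, 0.0])                      # gradient flips sign
step, m, v = adam_step(g2, m, v, t=2)
print(np.dot(step, g2))                         # negative: step still points the old way
```

Since the update is `x - step`, a descent direction requires `dot(step, grad) > 0`; the second step above violates this, so a check along the lines of the `is_direction_descent` flag discussed below may indeed be needed.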
Can you add the stochastic flag here too?
Do we need the `is_direction_descent`?
If Adam doesn't necessarily produce a descent direction, you need it in the class.
No description provided.