
Improving robustness of log and exp, with proper special values output #84

Merged
merged 1 commit into main on Jan 15, 2024

Conversation

balancap
Copy link
Contributor

@balancap balancap commented Jan 15, 2024

Improving robustness of log and exp, with proper special values output.

Making sure that `exp` of `0` is `1` and `log` of `0` is `-inf`. Using a custom `logsumexp` in the MNIST example until an additional scale propagation bug is solved.

NOTE: additional robustness means MNIST training converges when initialization scale > 1.
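For illustration, a minimal sketch in plain NumPy of the max-shifted `logsumexp` trick this PR relies on (this is not the repository's scaled-array implementation; the function name `stable_logsumexp` is hypothetical). Subtracting the maximum anchors the largest term at `exp(0) = 1`, so large inputs cannot overflow, and an all-`-inf` input yields `-inf` instead of NaN:

```python
import numpy as np

def stable_logsumexp(x):
    # Max-shifted log-sum-exp: log(sum(exp(x))) = m + log(sum(exp(x - m)))
    # with m = max(x). The largest shifted term is exp(0) = 1, so the sum
    # never overflows for large finite inputs.
    m = np.max(x)
    if not np.isfinite(m):
        # All entries are -inf (empty probability mass): return -inf
        # directly rather than computing -inf - (-inf) = NaN.
        return m
    return m + np.log(np.sum(np.exp(x - m)))

# exp(0) = 1 and log of a zero sum gives -inf, the special values
# the PR makes robust.
print(stable_logsumexp(np.array([0.0, 0.0])))          # log(2)
print(stable_logsumexp(np.array([-np.inf, -np.inf])))  # -inf
```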

@balancap balancap force-pushed the improve-logsumexp-scaled-robustness branch from a2e77ba to 5ba39f8 Compare January 15, 2024 16:24
@balancap balancap force-pushed the improve-logsumexp-scaled-robustness branch from 5ba39f8 to e371883 Compare January 15, 2024 16:25
@balancap balancap merged commit 63ef3ba into main Jan 15, 2024
2 checks passed
@balancap balancap deleted the improve-logsumexp-scaled-robustness branch January 15, 2024 16:28