
[REF] Replace full backward hook with tensor hook on module output #63

Merged
3 commits merged into main on Nov 8, 2023

Conversation

@f-dangel (Owner) commented on Nov 3, 2023

Resolves #56 and adds support for models with in-place activations.

See #56 for a detailed description. TL;DR: nn.Module.register_full_backward_hook is incompatible with in-place activations (pytorch/pytorch#61519). The fix suggested there is to install a forward hook on the module that registers a tensor hook on the module's output.
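Below is a minimal sketch of the workaround, assuming standard PyTorch hook semantics; the names forward_hook and grad_output_hook and the toy model are illustrative and not taken from singd:

```python
import torch
from torch import nn


def grad_output_hook(grad: torch.Tensor) -> None:
    # Tensor hook: fires during backward with the gradient flowing
    # back through the layer's output.
    print(f"grad w.r.t. layer output: {tuple(grad.shape)}")


def forward_hook(module: nn.Module, inputs: tuple, output: torch.Tensor) -> None:
    # Forward hook: runs after the module's forward pass and attaches
    # the tensor hook to the freshly produced output tensor.
    output.register_hook(grad_output_hook)


layer = nn.Linear(4, 3)
layer.register_forward_hook(forward_hook)

# An in-place activation follows the layer; a full backward hook on
# `layer` would break here (pytorch/pytorch#61519), but the tensor
# hook on the output still fires.
model = nn.Sequential(layer, nn.ReLU(inplace=True))
model(torch.rand(2, 4)).sum().backward()
```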

Review comment on singd/optim/optimizer.py (outdated, resolved)
@runame (Collaborator) left a comment

Missed this PR for some reason, but LGTM.

Review comments on test/optim/test_inplace_activations.py (outdated, resolved)
@f-dangel merged commit 5857af8 into main on Nov 8, 2023; all 14 checks passed.
@f-dangel deleted the inplace-activations branch on November 8, 2023 at 16:33.
Development

Successfully merging this pull request may close these issues.

Bug: Full backward hook incompatible with in-place activations