
Fix/remove-numel - Remove numel is zero check from context manager exit method #920

Merged: 4 commits into Xilinx:dev on May 14, 2024

Conversation

costigt-dev
Collaborator

Reason for this PR

torch.empty and torch.numel do not behave as originally expected, although this code path was unlikely to be triggered in the original scenario this context manager was added to resolve.
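For context, a short illustration of the PyTorch behaviour in question (the shapes below are hypothetical, not the original Brevitas code):

```python
import torch

# A placeholder bias allocated with torch.empty holds uninitialized memory,
# and its numel() equals the number of requested elements, never zero.
placeholder = torch.empty(8)
print(placeholder.numel())       # 8, not 0
print(torch.numel(placeholder))  # equivalent call, also 8

# Only an explicitly zero-sized tensor reports numel() == 0, so a
# `bias.numel() == 0` check cannot detect a bias that was merely
# allocated with torch.empty and never filled in.
print(torch.empty(0).numel())    # 0
```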

Changes Made in this PR

Removed the contents of the exit method in the load_quant_model context manager, as the check used to detect whether the newly added biases had been corrected would not work as intended.

Also replaced torch.empty with torch.zeros so the added bias will not affect the layer's output if it goes unused.
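As a sketch of why zeros is the safe default here (illustrative shapes, not code from this PR): a zero-valued bias leaves the layer output unchanged, whereas an empty (uninitialized) bias would add arbitrary values.

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 16)
weight = torch.randn(32, 16)

# A bias of zeros is effectively a no-op: the result matches the
# bias-free computation, while torch.empty(32) would add whatever
# values happened to be in the uninitialized memory.
out_no_bias = F.linear(x, weight)
out_zero_bias = F.linear(x, weight, torch.zeros(32))
assert torch.allclose(out_no_bias, out_zero_bias)
```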

Testing Summary

Risk Highlight

  • This PR includes code from another work (please detail).
  • This PR contains API-breaking changes.
  • This PR depends on work in another PR (please provide links/details).
  • This PR introduces new dependencies (please detail).
  • There are coverage gaps not covered by tests.
  • Documentation updates required in subsequent PR.

Checklist

  • Code comments added to any hard-to-understand areas, if applicable.
  • Changes generate no new warnings.
  • Updated any relevant tests, if applicable.
  • No conflicts with destination dev branch.
  • I reviewed my own code changes.
  • Initial CI/CD passing.
  • 1+ reviews given, and any review issues addressed and approved.
  • Post-review full CI/CD passing.

Future Work

@costigt-dev costigt-dev changed the title from "Fix/remove-numel" to "Fix/remove-numel - Remove numel is zero check from context manager exit method" on Mar 22, 2024
@costigt-dev costigt-dev marked this pull request as ready for review March 26, 2024 09:13
@Giuseppe5
Collaborator

We would still have an issue if the state dict the user is trying to load has no bias for certain modules, correct?

@costigt-dev
Collaborator Author

> We would still have an issue if the state dict the user is trying to load has no bias for certain modules, correct?

Yes, I'm not sure of the best way to handle that.

@Giuseppe5
Collaborator

The new change modifies the load state dict function within a QuantWBIOL layer.
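A minimal sketch of what such an override could look like, assuming a Linear-like layer; the class name and logic below are illustrative assumptions, not the actual QuantWBIOL implementation:

```python
import torch
import torch.nn as nn


class BiasTolerantLinear(nn.Linear):
    # Hypothetical layer: tolerates checkpoints that were saved without a
    # bias entry for a module that now has one, by falling back to zeros.

    def _load_from_state_dict(self, state_dict, prefix, local_metadata,
                              strict, missing_keys, unexpected_keys,
                              error_msgs):
        bias_key = prefix + "bias"
        if self.bias is not None and bias_key not in state_dict:
            # Inject a zero bias so strict loading does not report a
            # missing key and the layer output is unaffected.
            state_dict[bias_key] = torch.zeros_like(self.bias)
        super()._load_from_state_dict(state_dict, prefix, local_metadata,
                                      strict, missing_keys, unexpected_keys,
                                      error_msgs)
```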

@Giuseppe5 Giuseppe5 requested a review from nickfraser May 13, 2024 15:33
@Giuseppe5 Giuseppe5 merged commit a1926f0 into Xilinx:dev May 14, 2024
336 of 347 checks passed