Add RMSNorm (part of parallel_attn_blocks) #448

Open · wants to merge 8 commits into main
Conversation

lessw2020 (Contributor):

Summary:
This PR adds RMSNorm.
It is used in parallel_attention_blocks; the earlier monolithic PR for that feature is being split into smaller chunks for easier review and landing.
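
For context, a minimal sketch of an RMSNorm module along these lines (the class layout, eps default, and docstring here are illustrative assumptions; the actual implementation lives in torchmultimodal/modules/layers/normalizations.py):

```python
import torch
import torch.nn as nn


class RMSNorm(nn.Module):
    """Root Mean Square layer normalization.

    Normalizes the input by its root-mean-square statistic along the last
    dimension, then applies a learned per-dimension scale. Unlike LayerNorm,
    no mean is subtracted and no bias is added.

    Args:
        dim (int): size of the last dimension of the expected input.
        eps (float): small value added for numerical stability.
    """

    def __init__(self, dim: int, eps: float = 1e-6) -> None:
        super().__init__()
        self.eps = eps
        self.scale = nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Compute the statistic in fp32 for numerical stability, then cast
        # back to the input dtype before applying the learned scale.
        x_normed = x.float() * torch.rsqrt(
            x.float().pow(2).mean(dim=-1, keepdim=True) + self.eps
        )
        return x_normed.type_as(x) * self.scale
```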

Test plan:
Added two unit tests:
test_rms_norm_fp32
test_rms_core_algo (compares an F.norm-based reference against the PR version)
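
A hedged sketch of what these two tests could look like against the module sketched above (the names match the test plan, but the shapes, tolerances, and assertions are assumptions, not the PR's actual code):

```python
import math

import torch
import torch.nn.functional as F


def test_rms_norm_fp32() -> None:
    # fp32 input should come back as fp32 with the same shape.
    module = RMSNorm(dim=8)
    x = torch.randn(4, 8)
    output = module(x)
    assert output.dtype == torch.float32
    assert output.shape == x.shape


def test_rms_core_algo() -> None:
    # F.normalize divides by the L2 norm; rescaling by sqrt(dim) recovers
    # division by the root-mean-square, so it serves as a reference
    # implementation for the module above (scale is initialized to ones).
    dim = 8
    x = torch.randn(4, dim)
    expected = F.normalize(x, p=2.0, dim=-1) * math.sqrt(dim)
    torch.testing.assert_close(RMSNorm(dim)(x), expected, rtol=1e-4, atol=1e-5)
```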

Fixes:
Adds the docstring to RMSNorm requested in an earlier PR review.

@facebook-github-bot added the CLA Signed label on Aug 16, 2023
codecov-commenter commented on Aug 16, 2023

Codecov Report

Patch coverage: 100.00% and project coverage change: +0.06% 🎉

Comparison is base (951a452) 69.11% compared to head (122f4b9) 69.17%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #448      +/-   ##
==========================================
+ Coverage   69.11%   69.17%   +0.06%     
==========================================
  Files         170      170              
  Lines       11524    11547      +23     
==========================================
+ Hits         7965     7988      +23     
  Misses       3559     3559              
Files Changed                                      Coverage Δ
tests/modules/layers/test_normalizations.py       100.00% <100.00%> (ø)
torchmultimodal/modules/layers/normalizations.py  100.00% <100.00%> (ø)


@rohan-varma (Contributor) left a comment:


LG! I would just absorb the fp32 type check test into the other one for brevity.
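
One possible reading of that suggestion, merging the dtype check into the core-algorithm test (a sketch under the same assumptions as the test sketch above, not the code that actually landed):

```python
import math

import torch
import torch.nn.functional as F


def test_rms_norm() -> None:
    dim = 8
    x = torch.randn(4, dim)
    output = RMSNorm(dim)(x)

    # Absorbed fp32 type check: fp32 input should yield fp32 output.
    assert output.dtype == torch.float32

    # Core-algorithm check against the F.normalize reference.
    expected = F.normalize(x, p=2.0, dim=-1) * math.sqrt(dim)
    torch.testing.assert_close(output, expected, rtol=1e-4, atol=1e-5)
```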

Review threads (outdated, resolved):
torchmultimodal/modules/layers/normalizations.py
tests/modules/layers/test_normalizations.py (3 threads)
@facebook-github-bot (Contributor) commented:
@rohan-varma has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

Labels: CLA Signed
Projects: None yet
5 participants