The number of parameters on BatchNormalization module #89
Hi @hello-friend1242954,
Thank you for the reply!
The number of parameters of each module is calculated by the following code:
flops-counter.pytorch/ptflops/pytorch_engine.py, lines 110 to 112 in 5f2a45f
I used this code on torch.nn.BatchNorm2d like this:
```python
import torch

bn = torch.nn.BatchNorm2d(10)
# counts only trainable parameters
sum(p.numel() for p in bn.parameters() if p.requires_grad)
```
The last line returns 20, but torch.nn.BatchNorm2d also has a running (moving) mean and variance as parameters, doesn't it?
So I thought the correct number of parameters for torch.nn.BatchNorm2d(10) would be:

- weight parameters: 10
- bias parameters: 10
- running mean parameters: 10
- running variance parameters: 10

that is, 10 * 4 = 40.
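For reference, here is a minimal sketch of the check I ran (assuming standard PyTorch behavior, where the running statistics of BatchNorm2d are registered as buffers rather than trainable parameters, so they never appear in `bn.parameters()`):

```python
import torch

bn = torch.nn.BatchNorm2d(10)

# Trainable parameters: weight (gamma) and bias (beta) -> 10 + 10 = 20
n_params = sum(p.numel() for p in bn.parameters() if p.requires_grad)

# running_mean and running_var are registered as buffers, not parameters,
# so they are excluded from the count above.
n_running = bn.running_mean.numel() + bn.running_var.numel()

print(n_params, n_running)  # 20 20
```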
I'd appreciate it if you could explain this. Thank you!