
What is the effect of frozen layers on GMACs? #75

Open

ahmadmobeen opened this issue Jul 26, 2021 · 1 comment
Labels
question Further information is requested

Comments

@ahmadmobeen

The reported GMACs are the same regardless of which layers are frozen.

All layers are trainable:
Computational complexity:       7.63 GMac
Number of parameters:           128.92 M

Only classifier is trainable:
Computational complexity:       7.63 GMac
Number of parameters:           155.69 k

In my understanding, if param.requires_grad is set to False for some layers, those layers would not be computed; however, they would remain part of the graph.

So, in the calculation of GMACs, shouldn't such layers be excluded, since they would not be computed during training, thereby reducing the number of operations?

Please correct me if my understanding is wrong.
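
For reference, a minimal sketch of this kind of comparison (not the reporter's exact code; it assumes torchvision's vgg11 as a stand-in model, so the absolute numbers will differ from those above):

```python
import torchvision.models as models
from ptflops import get_model_complexity_info

model = models.vgg11()

# Freeze everything except the classifier, as in the second measurement.
for name, param in model.named_parameters():
    if not name.startswith('classifier'):
        param.requires_grad = False

# As the output above shows, the parameter count drops (frozen parameters
# are no longer counted as trainable) while the MAC count stays the same.
macs, params = get_model_complexity_info(
    model, (3, 224, 224), as_strings=True, print_per_layer_stat=False)
print(f'Computational complexity: {macs}')
print(f'Number of parameters:     {params}')
```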

@sovrasov
Owner

requires_grad prevents PyTorch from computing gradients for particular parameters during training. This flag doesn't affect forward-pass complexity, which is what ptflops measures. See the PyTorch docs for details.
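
To illustrate the distinction, a minimal sketch in plain PyTorch (a hypothetical nn.Linear example, not from the thread):

```python
import torch
import torch.nn as nn

layer = nn.Linear(1000, 1000)
x = torch.randn(1, 1000)

y_trainable = layer(x)        # forward pass with trainable weights

for p in layer.parameters():  # "freeze" the layer
    p.requires_grad = False
y_frozen = layer(x)           # the same multiply-accumulates still run

# Freezing changes only gradient tracking, not the forward computation
# that MAC counting measures.
print(torch.equal(y_trainable, y_frozen))                  # True: identical output
print(y_trainable.requires_grad, y_frozen.requires_grad)   # True False
```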

@sovrasov added the question label on Jul 26, 2021