detach output from teacher #100

Open

wants to merge 1 commit into base: 0.x
Conversation

@twmht (Contributor) commented Mar 2, 2022

When using SelfDistiller with the channel-wise distillation (CWD) loss, backward raises an exception because the tensor from the teacher is not detached.

This PR fixes that bug.
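
For context, a minimal sketch of the idea behind the fix, assuming the distiller records the teacher's outputs in a dict before the distillation loss is computed (the function name and dict layout are illustrative, not the exact SelfDistiller internals):

    import torch

    def record_teacher_outputs(outputs: dict) -> dict:
        # Detach every recorded teacher tensor so the distillation loss
        # cannot backpropagate through the teacher's forward pass.
        return {name: t.detach() for name, t in outputs.items()}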

@twmht (Contributor, Author) commented Mar 2, 2022

Alternatively, we could detach the tensor in the CWD loss (https://github.com/open-mmlab/mmrazor/blob/master/mmrazor/models/losses/cwd.py). What do you think?
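
A minimal sketch of that alternative, assuming a CWD-style loss that takes student and teacher prediction maps preds_S and preds_T of shape (N, C, H, W); the signature and temperature handling are illustrative, not the exact cwd.py code:

    import torch
    import torch.nn.functional as F

    def channel_wise_divergence(preds_S, preds_T, tau=1.0):
        # Detaching here stops gradients from flowing into the teacher,
        # regardless of how the distiller wired up the forward pass.
        preds_T = preds_T.detach()
        N, C, H, W = preds_S.shape
        # Channel-wise softmax over the spatial locations.
        softmax_T = F.softmax(preds_T.view(N, C, -1) / tau, dim=-1)
        logsoftmax_S = F.log_softmax(preds_S.view(N, C, -1) / tau, dim=-1)
        # KL(teacher || student), scaled by tau^2 for temperature softening.
        kl = softmax_T * (softmax_T.clamp(min=1e-12).log() - logsoftmax_S)
        return (tau ** 2) * kl.sum(-1).mean()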

@twmht closed this Apr 15, 2022
@pppppM (Collaborator) commented Apr 15, 2022

I'm so sorry, I missed this PR due to my negligence.

Some users train both the teacher and the student at the same time, so adding a flag to SelfDistiller may be the better fix. SingleTeacherDistiller uses teacher_trainable to control this:

    if self.teacher_trainable:
        output = self.teacher(**data)
    else:
        with torch.no_grad():
            output = self.teacher(**data)
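
A rough sketch of how the analogous flag could look in SelfDistiller (the constructor signature and method name are assumptions for illustration, not the final API):

    import torch

    class SelfDistiller:
        def __init__(self, teacher_trainable=False, **kwargs):
            # When False, the teacher forward builds no autograd graph, so
            # the distillation loss cannot backpropagate into the teacher.
            self.teacher_trainable = teacher_trainable

        def exec_teacher_forward(self, teacher, data):
            if self.teacher_trainable:
                return teacher(**data)
            with torch.no_grad():
                return teacher(**data)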

@pppppM self-assigned this Apr 15, 2022
@pppppM (Collaborator) commented Apr 15, 2022

Would you like to reopen this PR and complete this feature together?

@twmht (Contributor, Author) commented Apr 15, 2022

Yup, that would be better.

@twmht reopened this Apr 15, 2022
@codecov (bot) commented Apr 18, 2022

Codecov Report

Merging #100 (c19a8ec) into master (0dd407a) will increase coverage by 9.72%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #100      +/-   ##
==========================================
+ Coverage   56.58%   66.31%   +9.72%     
==========================================
  Files          83       92       +9     
  Lines        2932     3369     +437     
  Branches      540      613      +73     
==========================================
+ Hits         1659     2234     +575     
+ Misses       1197     1033     -164     
- Partials       76      102      +26     
Flag        Coverage Δ
unittests   66.28% <100.00%> (+9.69%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                                          Coverage Δ
mmrazor/models/distillers/self_distiller.py             100.00% <100.00%> (ø)
mmrazor/utils/__init__.py                               100.00% <0.00%> (ø)
mmrazor/models/ops/__init__.py                          100.00% <0.00%> (ø)
mmrazor/models/losses/__init__.py                       100.00% <0.00%> (ø)
mmrazor/models/ops/darts_series.py                       81.72% <0.00%> (ø)
...els/architectures/components/backbones/__init__.py   100.00% <0.00%> (ø)
.../architectures/components/heads/no_bias_fc_head.py     0.00% <0.00%> (ø)
...chitectures/components/backbones/darts_backbone.py    88.63% <0.00%> (ø)
mmrazor/models/ops/mobilenet_series.py                   95.65% <0.00%> (ø)
mmrazor/apis/mmdet/inference.py                          51.51% <0.00%> (ø)
... and 47 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 0dd407a...c19a8ec.

humu789 pushed a commit to humu789/mmrazor that referenced this pull request Feb 13, 2023
* align mmedit static cfg

* add for test

* update requirements

* add dependencies from mmlab

* change name

* lower thresh for interrogate at first

* update test

* update to skip

* Move import tensorrt

* Move import statement

Co-authored-by: SingleZombie <singlezombie@163.com>