Is there any particular reason for putting BlurPool before the skip-connection layers? #51
Comments
Hi, I think you're referring to this section: https://github.com/adobe/antialiased-cnns/blob/master/antialiased_cnns/resnet.py#L147-L148, which makes it look like the blur comes before the conv. Your understanding is correct: it should be conv first. There is actually a conv before it, which was changed from stride 2 to stride 1: https://github.com/adobe/antialiased-cnns/blob/master/antialiased_cnns/resnet.py#L142
Thank you for your reply! I understand it is conv(stride1), as in antialiased-cnns/antialiased_cnns/resnet.py, lines 252 to 253 (b27a34a).
Got it. In this case, because there's no non-linearity, the 1x1 conv and the NxN blur are interchangeable. It's cheaper to do blur --> stride --> conv rather than conv --> blur --> stride.
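Since both operations are linear and act along different axes (the 1x1 conv mixes channels pointwise, while the blur mixes spatial positions within each channel), the two sums can be swapped and the composition is order-independent. A toy, framework-free sketch of this commutativity on 1-D signals (all shapes, values, and function names below are made up for illustration, not taken from the repo):

```python
import random

def conv1x1(x, weight):
    # x: C_in channels, each a list of N samples.
    # weight: C_out x C_in matrix; mixes channels pointwise (no spatial mixing).
    c_in, n = len(x), len(x[0])
    return [[sum(weight[o][i] * x[i][t] for i in range(c_in))
             for t in range(n)]
            for o in range(len(weight))]

def blur_stride(x, kernel, stride):
    # Per-channel 1-D "valid" convolution with a blur kernel,
    # followed by subsampling (no channel mixing).
    k = len(kernel)
    out = []
    for ch in x:
        full = [sum(kernel[j] * ch[t + j] for j in range(k))
                for t in range(len(ch) - k + 1)]
        out.append(full[::stride])
    return out

random.seed(0)
x = [[random.random() for _ in range(10)] for _ in range(3)]   # 3 channels x 10
w = [[random.random() for _ in range(3)] for _ in range(4)]    # 4 x 3 channel mix
kernel = [0.25, 0.5, 0.25]                                     # triangle blur

a = blur_stride(conv1x1(x, w), kernel, 2)   # conv --> blur --> stride
b = conv1x1(blur_stride(x, kernel, 2), w)   # blur --> stride --> conv

assert all(abs(u - v) < 1e-9 for ra, rb in zip(a, b) for u, v in zip(ra, rb))
print("both orderings give identical outputs")
```

Note that inserting any non-linearity (ReLU, BatchNorm in training mode, etc.) between the two ops would break this equivalence, which is why it only applies to the skip-connection path.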
Hi. Thanks for your brilliant work on solving the antialiasing problem. I find that in the implementation of Antialiased ResNet-50, the order in which the antialiased layers are applied in the skip connections differs from the order in the main pathway.
The original skip connection is:
...->conv(stride=2)->...
Now it is:
...->blurpool(stride=2)->conv(stride=1)->...
But from my understanding, it should be:
...->conv(stride=1)->blurpool(stride=2)->...
And in this way, it would be identical to how you handle Bottleneck.conv2 and Bottleneck.conv3.
Is there any reason the order is reversed in the skip connections?
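For what it's worth, the efficiency side of this question can be sanity-checked with back-of-envelope multiply-accumulate counts for the two equivalent orderings. The sizes below (56x56 input, 256 to 512 channels, 3x3 blur) are assumed for illustration only, not taken from the repo:

```python
# Hypothetical layer sizes for one ResNet downsampling skip connection.
H = W = 56            # input spatial resolution (assumed)
C_IN, C_OUT = 256, 512
K = 3                 # blur kernel size (assumed)

# conv(stride=1) -> blurpool(stride=2): the 1x1 conv runs at full resolution,
# and the blur runs over C_OUT feature maps.
conv_first = C_IN * C_OUT * H * W                   # 1x1 conv MACs
conv_first += C_OUT * K * K * (H // 2) * (W // 2)   # blur MACs

# blurpool(stride=2) -> conv(stride=1): the blur runs over C_IN feature maps,
# and the 1x1 conv runs at quarter resolution.
blur_first = C_IN * K * K * (H // 2) * (W // 2)     # blur MACs
blur_first += C_IN * C_OUT * (H // 2) * (W // 2)    # 1x1 conv MACs

print("conv first:", conv_first, "MACs")
print("blur first:", blur_first, "MACs")
print("speedup factor: %.2f" % (conv_first / blur_first))
```

With these (assumed) sizes, blurring before the strided conv is roughly 4x cheaper, since the dominant 1x1-conv cost is paid at quarter spatial resolution.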