
Implementation details of the Sobel layer? #4

Open
Uthory opened this issue Jul 30, 2021 · 5 comments


@Uthory

Uthory commented Jul 30, 2021

Thanks for your work.
As the paper shows, the Sobel layer consists of four sublayers, but I wonder how the Sobel results in the x and y directions are fused together.
[figure from the paper: the four sublayers of the Sobel layer]
Is the code below right?

def forward(self, x):
    # Directional Sobel responses (typo fixed: both convolutions read x;
    # F.conv2d takes bias=None rather than bias=False)
    sobel_x = F.conv2d(x, sobel_kernel_x, padding=1)
    sobel_y = F.conv2d(x, sobel_kernel_y, padding=1)
    # Fuse the two directions into a gradient magnitude
    sobel_rs = torch.sqrt(torch.pow(sobel_x, 2) + torch.pow(sobel_y, 2))
    sobel_rs = F.normalize(sobel_rs, p=2)
    sobel_rs = self.bn(sobel_rs)
    sobel_rs = torch.sigmoid(sobel_rs)  # F.sigmoid is deprecated
    return x * sobel_rs
@Uthory
Author

Uthory commented Aug 3, 2021

Since the Sobel conv is the fundamental operation of your ESB, I'm looking forward to your response. Thanks in advance.

@dong03
Owner

dong03 commented Aug 3, 2021

Almost there, except for the order of the L2 norm and BatchNorm2d, which is corrected in the latest version (see https://arxiv.org/abs/2104.06832).

Our implementation of the Sobel layer, together with the rest of the modules, is still under internal review.
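
(A minimal sketch of the corrected ordering, assuming sobel_x / sobel_y are the directional responses and self.bn_x / self.bn_y are hypothetical per-direction nn.BatchNorm2d modules:)

    # snippet above:  fuse -> L2-normalize -> BatchNorm
    # corrected:      BatchNorm each direction first, then fuse via the L2 norm
    sobel_x = self.bn_x(sobel_x)
    sobel_y = self.bn_y(sobel_y)
    sobel_rs = torch.sqrt(sobel_x ** 2 + sobel_y ** 2)  # the "L2 norm" over the two directions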

@Uthory
Author

Uthory commented Aug 4, 2021

Sorry, but I still have a question. What is the meaning of the L2 norm? Your latest version shows the L2 norm appended after the BN layer, but the output of the Sobel layer and BN has shape (B, 1, H, W).
Won't an L2 norm over a single channel set all the values to 1?
Besides, how are the Sobel results in x and y fused? Note that sqrt(x^2 + y^2) may cause gradient explosion.
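
(Aside: the gradient of the square root is unbounded at zero; a common generic stabilization, not from this repo, is a small epsilon under the root:)

    sobel_rs = torch.sqrt(sobel_x ** 2 + sobel_y ** 2 + 1e-6)  # eps keeps the gradient finite at 0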

@Chenxr1999
Collaborator

In fact, we apply the L2 norm as the fusion function for x and y, which is why the channel dimension becomes 1:
sobel_rs = torch.sqrt(torch.pow(self.bn_x(sobel_x), 2) + torch.pow(self.bn_y(sobel_y), 2))
where self.bn_x and self.bn_y are nn.BatchNorm2d instances.
The code will be released soon.
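
(For readers following along: a minimal self-contained sketch of the layer as described in this thread, i.e. fixed Sobel kernels, per-direction BatchNorm, L2-norm fusion, and a sigmoid gate. The class name, the per-channel averaging of the kernels, and the eps term are assumptions for illustration, not the official code:)

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SobelLayer(nn.Module):
        # Illustrative sketch assembled from this thread -- not the official release.
        def __init__(self, in_channels):
            super().__init__()
            # Fixed 3x3 Sobel kernels, averaged across input channels so each
            # directional response has a single channel, matching the
            # (B, 1, H, W) shape discussed above.
            kx = torch.tensor([[-1., 0., 1.],
                               [-2., 0., 2.],
                               [-1., 0., 1.]])
            ky = torch.tensor([[-1., -2., -1.],
                               [ 0.,  0.,  0.],
                               [ 1.,  2.,  1.]])
            self.register_buffer('kernel_x', kx.view(1, 1, 3, 3).repeat(1, in_channels, 1, 1) / in_channels)
            self.register_buffer('kernel_y', ky.view(1, 1, 3, 3).repeat(1, in_channels, 1, 1) / in_channels)
            self.bn_x = nn.BatchNorm2d(1)
            self.bn_y = nn.BatchNorm2d(1)

        def forward(self, x):
            # Directional responses, each of shape (B, 1, H, W).
            gx = F.conv2d(x, self.kernel_x, padding=1)
            gy = F.conv2d(x, self.kernel_y, padding=1)
            # BatchNorm each direction first, then fuse via the L2 norm;
            # the eps term is an assumption to keep the sqrt gradient finite at zero.
            mag = torch.sqrt(self.bn_x(gx) ** 2 + self.bn_y(gy) ** 2 + 1e-6)
            # Sigmoid gate, broadcast back over the input channels.
            return x * torch.sigmoid(mag)

With this sketch, SobelLayer(3)(torch.randn(2, 3, 64, 64)) returns a (2, 3, 64, 64) tensor gated by a single-channel edge magnitude.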

@Uthory
Author

Uthory commented Aug 11, 2021

Hello there! I have been training my model these past few days. Could you please tell me roughly the final loss on CASIA_v2? I want to know whether it has converged.
