
Add LayerNorm layer #88

Open
mawanda-jun opened this issue Nov 8, 2023 · 2 comments
Comments

@mawanda-jun
Hi there,
thank you for your groundbreaking work.

I'm trying to develop a rotation-equivariant version of this neural network. I successfully replaced every layer of the network except [torch.nn.LayerNorm](https://pytorch.org/docs/stable/generated/torch.nn.LayerNorm.html), which I wasn't able to replace (I used InnerBatchNorm instead). Performance is noticeably lower than the classical version (~15% lower on ImageNet classification), and I suspect this substitution is the culprit.

Is there a plan to support this norm layer? I'm willing to contribute, but I think I need guidance.

Cheers.

@psteinb
Contributor

psteinb commented Nov 8, 2023

I am in the same boat, but with respect to InstanceNorm; see #69.

@mawanda-jun
Author

Definitely interesting. I feel I need to spend a bit more time on the theory before I can implement it myself. Do you think it would be useful to look at our layers together?
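One direction that might work, though I haven't verified it against the library's representation machinery: apply the LayerNorm statistics to the field *norms* only, and rescale each field vector accordingly. Since any group action that acts within a field (e.g. cyclic shifts for a regular representation) preserves the field's norm, multiplying each field by a function of its norm should commute with that action. A rough sketch in plain PyTorch, assuming features are grouped into `num_fields` fields of size `field_size` (all names here are hypothetical, not part of the library):

```python
import torch
import torch.nn as nn


class NormLayerNorm(nn.Module):
    """Sketch of an equivariance-friendly LayerNorm variant.

    Normalizes the per-field norms across fields (LayerNorm-style),
    then rescales each field vector to its normalized norm. Because
    only the norm is touched, actions within each field commute with
    the layer. This is an illustrative sketch, not the library's API.
    """

    def __init__(self, num_fields: int, field_size: int, eps: float = 1e-5):
        super().__init__()
        self.num_fields = num_fields
        self.field_size = field_size
        self.eps = eps
        # one learnable scale per field, acting on the norm only
        self.weight = nn.Parameter(torch.ones(num_fields))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, num_fields * field_size, H, W)
        B, C, H, W = x.shape
        x = x.view(B, self.num_fields, self.field_size, H, W)
        norms = x.norm(dim=2, keepdim=True)            # (B, F, 1, H, W)
        mean = norms.mean(dim=1, keepdim=True)         # stats over fields
        var = norms.var(dim=1, keepdim=True, unbiased=False)
        normed = (norms - mean) / torch.sqrt(var + self.eps)
        # rescale each field vector so its norm becomes the normalized norm
        x = x / (norms + self.eps) * (normed * self.weight.view(1, -1, 1, 1, 1))
        return x.view(B, C, H, W)
```

For regular-representation fields one can sanity-check equivariance by cyclically rolling each field and confirming the output rolls the same way.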
