
doubt about patch embeddings #1

Open
zaocan666 opened this issue Jan 24, 2021 · 4 comments

Comments

@zaocan666

Hi, awesome repo.
But I wonder whether it is necessary to apply GELU and LayerNorm after the linear layer to get the patch embeddings. Neither the ViT paper nor its code applies these layers.
I am referring to lines 269 and 270 in /SETR/transformer_model.py.
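For context, here is a minimal sketch of what such a patch-embedding module might look like; the names and the `use_norm_act` flag are hypothetical, and the actual lines 269–270 in /SETR/transformer_model.py may differ:

```python
import torch
import torch.nn as nn

class PatchEmbedding(nn.Module):
    """Hypothetical sketch: project flattened patches with a linear layer,
    then (the point in question) optionally apply LayerNorm and GELU."""

    def __init__(self, patch_dim, hidden_size, use_norm_act=True):
        super().__init__()
        self.proj = nn.Linear(patch_dim, hidden_size)  # linear projection of flattened patches
        self.use_norm_act = use_norm_act
        if use_norm_act:
            self.norm = nn.LayerNorm(hidden_size)  # the LayerNorm being discussed
            self.act = nn.GELU()                   # the GELU being discussed

    def forward(self, x):
        # x: (batch, num_patches, patch_dim)
        x = self.proj(x)
        if self.use_norm_act:
            x = self.act(self.norm(x))
        return x
```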

@920232796
Owner

The Transformer paper does use these two operations; you can check the model structure in the original paper, Attention Is All You Need.

@zaocan666
Author

But the ViT paper and its code don't seem to use them.

@920232796
Owner

I'm not sure about that either. To be honest, I haven't read that paper or its code... You could try removing them; I don't think it should make much difference.
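Using the hypothetical sketch above, removing them would amount to keeping only the linear projection, ViT-style:

```python
# ViT-style: plain linear projection only, no LayerNorm/GELU
embed = PatchEmbedding(patch_dim=16 * 16 * 3, hidden_size=768, use_norm_act=False)
tokens = embed(torch.randn(2, 196, 16 * 16 * 3))  # -> (2, 196, 768)
```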

@zaocan666
Author

OK, thanks a lot.
