https://arxiv.org/abs/2105.02358

Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks (Meng-Hao Guo, Zheng-Ning Liu, Tai-Jiang Mu, Shi-Min Hu)

Replaces self-attention with attention over an external memory bank / latent, somewhat in the style of Perceiver. This does seem like a fairly promising direction.
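A minimal PyTorch sketch of the core idea: instead of computing attention among the input tokens themselves, each token attends to a small learned memory implemented as two linear layers (keys and values), with the double normalization the paper describes (softmax over tokens, then L1 over memory slots). Names like `num_mem` and the chosen sizes are illustrative assumptions, not the paper's exact hyperparameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExternalAttention(nn.Module):
    """External attention: tokens attend to a learned external memory
    (two linear layers as key/value memories) instead of to each other.
    `num_mem` (memory size) is an illustrative choice, not from the paper."""

    def __init__(self, dim: int, num_mem: int = 64):
        super().__init__()
        self.mk = nn.Linear(dim, num_mem, bias=False)  # memory keys M_k
        self.mv = nn.Linear(num_mem, dim, bias=False)  # memory values M_v

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_tokens, dim)
        attn = self.mk(x)                    # (B, N, S): similarity of each token to each memory slot
        attn = F.softmax(attn, dim=1)        # normalize over tokens ...
        attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-9)  # ... then L1 over memory slots
        return self.mv(attn)                 # (B, N, dim)


x = torch.randn(2, 196, 256)
print(ExternalAttention(256)(x).shape)  # torch.Size([2, 196, 256])
```

Because the memory size S is a fixed constant, the cost is linear in the number of tokens N, rather than the O(N^2) of standard self-attention.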

#efficient_attention