https://arxiv.org/abs/2210.01820

MOAT: Alternating Mobile Convolution and Attention Brings Strong Vision Models (Chenglin Yang, Siyuan Qiao, Qihang Yu, Xiaoding Yuan, Yukun Zhu, Alan Yuille, Hartwig Adam, Liang-Chieh Chen)

Takes the somewhat classic-flavored idea of combining an inverted bottleneck CNN block with an attention block, adds the trick of swapping self-attention for window attention, and applies it to downstream tasks. The scores are quite impressive.
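Below is a minimal sketch of what such a block could look like: an MBConv-style inverted bottleneck followed by self-attention, with an optional window size for downstream (high-resolution) settings. This is my own illustration, not the paper's implementation; names like `MBConv`, `MOATBlock`, and `window_size` are assumptions, and details (norm placement, relative position bias, downsampling) are omitted.

```python
import torch
import torch.nn as nn

class MBConv(nn.Module):
    """Inverted bottleneck conv block: expand -> depthwise -> project, with a residual."""
    def __init__(self, dim, expansion=4):
        super().__init__()
        hidden = dim * expansion
        self.net = nn.Sequential(
            nn.BatchNorm2d(dim),
            nn.Conv2d(dim, hidden, 1, bias=False),
            nn.GELU(),
            nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden, bias=False),  # depthwise
            nn.GELU(),
            nn.Conv2d(hidden, dim, 1, bias=False),
        )

    def forward(self, x):
        return x + self.net(x)

class MOATBlock(nn.Module):
    """Hypothetical MOAT-style block: MBConv followed by (optionally windowed) self-attention."""
    def __init__(self, dim, num_heads=8, window_size=None):
        super().__init__()
        self.mbconv = MBConv(dim)
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.window_size = window_size  # None -> global attention; int -> window attention

    def forward(self, x):  # x: (B, C, H, W)
        x = self.mbconv(x)
        B, C, H, W = x.shape
        if self.window_size is None:
            # Global self-attention over all H*W tokens.
            tokens = x.flatten(2).transpose(1, 2)              # (B, H*W, C)
            t = self.norm(tokens)
            tokens = tokens + self.attn(t, t, t, need_weights=False)[0]
            x = tokens.transpose(1, 2).reshape(B, C, H, W)
        else:
            # Window attention: partition into non-overlapping w x w windows
            # (assumes H and W are divisible by the window size).
            w = self.window_size
            xw = x.reshape(B, C, H // w, w, W // w, w)
            xw = xw.permute(0, 2, 4, 3, 5, 1).reshape(-1, w * w, C)   # (B*nWin, w*w, C)
            t = self.norm(xw)
            xw = xw + self.attn(t, t, t, need_weights=False)[0]
            xw = xw.reshape(B, H // w, W // w, w, w, C).permute(0, 5, 1, 3, 2, 4)
            x = xw.reshape(B, C, H, W)
        return x

# Usage: global attention for classification-scale inputs, windowed for larger ones.
block = MOATBlock(dim=96, window_size=7)
out = block(torch.randn(2, 96, 56, 56))  # -> (2, 96, 56, 56)
```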

#backbone #transformer