LLM Group, Institute for Advanced Algorithms Research, Shanghai
Repositories (showing 10 of 11)
- Awesome-Attention-Heads
  The attention heads in the Transformer architecture possess a variety of capabilities. This is a carefully compiled list summarizing the diverse functions of attention heads.
- ICSFSurvey
  A comprehensive survey on Internal Consistency and Self-Feedback in Large Language Models, covering theoretical frameworks, task classifications, evaluation methods, future research directions, and more.