Awesome LLM Plaza: daily tracking of all sorts of awesome LLM topics, e.g. LLMs for coding, robotics, reasoning, multimodality, etc.
Updated Jul 4, 2024
Implementation of 💍 Ring Attention, from Liu et al. at Berkeley AI, in PyTorch
Counting-Stars (★)
Official release of the InternLM2.5 7B base and chat models, with 1M-token context support
This is the official implementation of the paper "Needle In A Multimodal Haystack"
TriForce: Lossless Acceleration of Long Sequence Generation with Hierarchical Speculative Decoding
Transformers with Arbitrarily Large Context
This is the official repo of "QuickLLaMA: Query-aware Inference Acceleration for Large Language Models"
Open-source code for the paper "Retrieval Head Mechanistically Explains Long-Context Factuality"
The official repo for "LLoCo: Learning Long Contexts Offline"
Codes for the paper "∞Bench: Extending Long Context Evaluation Beyond 100K Tokens": https://arxiv.org/abs/2402.13718
Code and documents of LongLoRA and LongAlpaca (ICLR 2024 Oral)
Implementation of Infini-Transformer in PyTorch
Mamba: Linear-Time Sequence Modeling with Selective State Spaces
"Found in the Middle: How Language Models Use Long Contexts Better via Plug-and-Play Positional Encoding" by Zhenyu Zhang, Runjin Chen, Shiwei Liu, Zhewei Yao, Olatunji Ruwase, Beidi Chen, Xiaoxia Wu, Zhangyang Wang
PyTorch implementation of Infini-Transformer from "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" (https://arxiv.org/abs/2404.07143)
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in PyTorch
LongAlign: A Recipe for Long Context Alignment Encompassing Data, Training, and Evaluation
The official implementation of "Ada-LEval: Evaluating long-context LLMs with length-adaptable benchmarks"
Code for the paper "InfLLM: Unveiling the Intrinsic Capacity of LLMs for Understanding Extremely Long Sequences with Training-Free Memory"