Code and documentation to train Stanford's Alpaca models, and generate the data.
✨✨Latest Advances on Multimodal Large Language Models
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
Must-read Papers on LLM Agents.
An automatic evaluator for instruction-following language models. Human-validated, high-quality, cheap, and fast.
An open-sourced knowledgeable large language model framework.
A collection of open-source datasets to train instruction-following LLMs (ChatGPT, LLaMA, Alpaca).
A simulation framework for RLHF and alternatives. Develop your RLHF method without collecting human data.
PhoGPT: Generative Pre-training for Vietnamese (2023)
Reading list on instruction tuning, a trend that started with Natural-Instructions (ACL 2022), FLAN (ICLR 2022), and T0 (ICLR 2022).
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
A collection of ChatGPT and GPT-3.5 instruction-based prompts for generating and classifying text.
[NeurIPS'23] "MagicBrush: A Manually Annotated Dataset for Instruction-Guided Image Editing".
[ICLR 2024] Mol-Instructions: A Large-Scale Biomolecular Instruction Dataset for Large Language Models
BigCodeBench: Benchmarking Code Generation Towards AGI
[NeurIPS'24 Spotlight] EVE: Encoder-Free Vision-Language Models
Code for "Lion: Adversarial Distillation of Proprietary Large Language Models (EMNLP 2023)"
Finetune LLaMA-7B with Chinese instruction datasets
EditWorld: Simulating World Dynamics for Instruction-Following Image Editing
WangChanGLM 🐘 - The Multilingual Instruction-Following Model